The Descartes Labs Platform is designed to answer some of the world’s most complex and pressing geospatial analytics questions. Our customers use the platform to build algorithms and models that transform their businesses quickly, efficiently, and cost-effectively.
By giving data scientists and their line-of-business colleagues the best geospatial data and modeling tools in one package, we help turn AI into a core competency.
Data science teams can use our scaling infrastructure to design models faster than ever, using our massive data archive or their own.
Please visit https://descarteslabs.com for more information about the Descartes Labs Platform and to request access.
The `descarteslabs` Python package, available at https://pypi.org/project/descarteslabs/, provides client-side access to the Descartes Labs Platform for our customers. You must be a registered customer with access to the Descartes Labs Platform before you can make use of this package.
The documentation for the latest release can be found at https://docs.descarteslabs.com. For any issues, please contact Customer Support at https://support.descarteslabs.com.
- Support for Python 3.9 has been dropped, as it is reaching end of life and many dependencies no longer support it.
- Updated copyright message everywhere.
- To prepare for multi-audience support in the clients, support for new custom claims in tokens has been added so that the user's unique namespace, which serves as a global id for the user, can be determined without resorting to computing it on the fly.
- Fixed a bug where some geometries weren't supported by blob geometry properties.
- Breaking change: In the past, regardless of the ordering of columns when a `Table` was created, or the ordering of columns in the `TableOptions`, the result of a feature query would always place the `uuid` column last. As of this version, the behavior has been modified to preserve the ordering of columns: if no column list is provided in the options, the `uuid` column will appear in the position it has in the underlying table, per the `Table.columns` property. Similarly, if a query specifies an explicit list of columns in the `TableOptions`, `uuid` will appear in the same position in the result as it does in the supplied list. However, in keeping with prior behavior, if an explicit list of columns does not include `uuid`, it will be added automatically and will appear as the last column of the result.
- The `ilike` `property_filtering` expression type has been added to support case-insensitive wildcard matching of Vector Table column values (a sketch follows this list).
- Fixed a problem with unpickling Catalog objects pickled with an earlier version. Please be aware that we do not support the pickling of any Catalog objects, so if in doubt, don't do it!
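A minimal sketch of the new `ilike` expression, assuming the `Properties` helper from `descarteslabs.utils` and a hypothetical `name` column; the exact method form and the SQL-style `%` wildcard are assumptions, so consult the Vector API documentation for the authoritative usage.

```python
from descarteslabs.utils import Properties

p = Properties()

# Case-insensitive wildcard match against a hypothetical "name" column.
flt = p.name.ilike("%farm%")

# Pass `flt` wherever the Vector client accepts a property filter,
# e.g. as part of the TableOptions for a feature query.
```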
- The links for interacting with login and token generation have been updated to refer to https://app.descarteslabs.com.
- All `CatalogObject` classes which support the `owners`, `writers`, and `readers` fields have been refactored to derive this support from the new `AuthCatalogObject`. This change does not affect the behavior of any of these classes. The methods `AuthCatalogObject.user_is_owner()`, `AuthCatalogObject.user_can_write()`, and `AuthCatalogObject.user_can_read()` have been added to allow testing of permissions prior to attempting an operation such as updating or deleting the object.
- `EventSchedule` now has a read-only `expires` attribute which indicates when the schedule will be expired and deleted.
- `EventSubscription` now has a read-only `owner_role_arn` field which contains the AWS Role which will be used by certain `EventRule` targets that reside in an external AWS account.
- `EventRule` has been enhanced to support SQS Queue targets.
- Several new helper classes for use with `EventSubscription` are now supported: `EventSubscriptionSqsTarget`, `NewImageEventSubscription`, `NewStorageEventSubscription`, `NewVectorEventSubscription`, and `ComputeFunctionCompletedEventSubscription`. The latter supports events generated by the Compute service as described below.
- The Compute service now generates a `compute-function-completed` event every time the number of outstanding (pending or running) jobs transitions to 0, akin to the `Function.wait_for_completion()` method. These can be used with the Catalog service events support to trigger other operations (see the sketch below).
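A hypothetical sketch of wiring a `compute-function-completed` event to a subscription. The helper class name comes from the list above, but the constructor parameters shown are assumptions rather than documented signatures; see the Catalog Guide for the real API.

```python
from descarteslabs.catalog import ComputeFunctionCompletedEventSubscription

# Subscribe to completion events for a (hypothetical) Compute Function id.
subscription = ComputeFunctionCompletedEventSubscription(
    "some-org:my-function-id",
    name="on-my-function-completed",  # hypothetical subscription name
)
subscription.save()
```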
- Support for Python 3.8 has been removed.
- Support for Python 3.12 has been added.
- Some dependencies have been updated to address security vulnerabilities.
- The dependency on `pytz` has been removed in favor of the standard `zoneinfo` package.
- Minor changes and additions to the client exception hierarchy so that `ClientError`s and `ServerError`s are not conflated in the retry support.
- The Catalog now provides support for Events, essentially notifications of new or updated assets in the Catalog, including images and storage blobs. Additionally, scheduled calendar-based events can be defined. You can subscribe to these events to trigger running a Compute function of your choice. This makes it possible to set up automated processing of new imagery. See the [Catalog Guide](https://docs.descarteslabs.com/guides/catalog.html) and API documentation for more information.
- Due to declining support for Python 3.8 across the ecosystem, we have discontinued our support for Python 3.8. It is expected that the client will continue to function until Python 3.8 reaches End of Life (October 2024), but we can no longer test new releases against this version.
- The Catalog Storage Blob deletion methods have been enhanced to support waiting for completion of the operation. When a blob is deleted, it is removed immediately from the catalog and a background asynchronous task is launched to clean up the contents of the blob from the backing storage. If a blob is deleted and a new blob with the identical id is immediately created and uploaded before this background task completes, it is possible for the background task to end up deleting the new blob contents. As of this release, the `Blob` instance and class delete methods return a `BlobDeletionTaskStatus` object which provides a `wait_for_completion` method that can be used to wait until the background task completes and it is safe to create a new blob with the same id. For the `Blob.delete_many` method, the `wait_for_completion=True` parameter can be used to wait for all the supplied blobs to be completely deleted. Note that in the case of the `Blob.delete` class method, this is a very slight breaking change: it used to return `True` or `False`, and now returns a `BlobDeletionTaskStatus` or `None`, which have the same truthiness and hence are very likely to behave identically in practical use. A short sketch follows.
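A minimal sketch of safely recreating a blob with the same id, based on the behavior described above; the blob id is hypothetical.

```python
from descarteslabs.catalog import Blob

# The class delete method returns a BlobDeletionTaskStatus,
# or None if the blob did not exist.
task = Blob.delete("data/some-org:some-blob")
if task is not None:
    # Block until the backing storage has been cleaned up; after this it
    # is safe to create and upload a new blob with the same id.
    task.wait_for_completion()
```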
Bugfix only
- The `descarteslabs` client CLI script generated by the installation process was broken. Now it works!
A very minor release with some obscure bug fixes.
- The `descarteslabs` client CLI has had an overhaul. Gone is the obsolete support for the Raster client; added is support for querying Catalog Products, Bands, and Blobs and managing sharing for the same.
- Minor fixes to the authorization flow on login.
- Add testing of Blobs.
- Corrected regular expressions used to parse the `memory` argument to the `Function` constructor.
- Improved documentation of the `cpus` and `memory` arguments to the `Function` constructor.
- Fixed a bug in seldom-used code to clear client state causing an import failure.
- Fixed a bug in `Table.visualize()` which could cause Authorization (401) errors when rendering tiles into an `ipyleaflet.Map`.
- Bumped some minimum dependency version constraints to avoid security vulnerabilities.
- Fixed a bug in `Table.visualize()` that was causing it to fail.
Due to a number of breaking changes, the version has been bumped to 3.0.0. However, the vast majority of use patterns in typical user code will not require changes. Please review the specifics below.
- The `tags` attribute on Catalog objects can now contain up to 32 elements, each up to 1000 characters long. But why would you even want to go there?
- Breaking Change: Derived bands, never supported in the AWS environment and catalog products, have been removed.
- The new `Blob.delete_many` method may be used to delete large numbers of blobs efficiently.
- The `Blob.get_or_create` method didn't allow supplying `storage_type`, `namespace`, or `name` parameters. Now it works as expected, either returning a saved `Blob` from the Catalog, or an unsaved blob that you can use to upload and save its data.
- Image methods `ndarray` and `download` no longer pass the image's default geocontext geometry as a cutline. This is to avoid problems when trying to raster a complete single image in its native CRS and resolution, where imperfect geometries (due to a simplistic projection to EPSG:4326) can cause some boundary pixels to be masked. When passing an explicit `GeoContext` to these methods, consider whether any cutline geometry is required or not, to avoid these issues.
- `Function` and `Job` objects now have a new `environment` attribute which can be used to define environment variables for the jobs when they are run.
- Breaking Change: The `Function.map` method previously had no bound on how many jobs could be created at one time. This led to operational problems with very large numbers of jobs. Now it submits jobs in batches (up to 1000 jobs per batch) to avoid request timeouts, and is more robust on retryable errors so that duplicate jobs are not submitted accidentally. There is still no bound on how many jobs you may create with a single call to `Function.map`. Additionally, since it is possible that some jobs may be successfully submitted and others not, the return value, while still behaving as a list of `Job`s, is now a `JobBulkCreateResult` object which has `is_success` and `error` properties that can be used to determine whether all submissions were successful, what errors may have occurred, and what jobs have actually been created. Only if the first batch fails hard will the method raise an exception (see the sketch following this list).
- The `Job.statistics` member is now typed as a `JobStatistics` object.
- The efficiency of deleting many jobs at once has been significantly improved using `Function.delete` and `Function.delete_jobs`. It is still possible to encounter request timeouts with very large numbers of jobs; workarounds are now documented in the API documentation for the `Function.delete_jobs` method.
- The `ComputeClient.check_credentials` method has been added, so that the client can determine if valid user credentials have already been registered with the Compute service.
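A minimal sketch of handling the `JobBulkCreateResult` returned by `Function.map`, assuming `fn` is an existing saved Compute `Function`; the per-job argument shape is hypothetical.

```python
# `fn` is assumed to be an existing, saved Compute Function.
jobs = fn.map([(i,) for i in range(5000)])  # hypothetical per-job arguments

if not jobs.is_success:
    # Some batches may have succeeded while a later one failed.
    print("Submission errors:", jobs.error)

# The result still behaves as a list of Job objects.
print("Jobs created:", len(jobs))
```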
- The Vector client library, previously available as the `descarteslabs-vector` package on PyPI, has now been integrated into the Descartes Labs Python Client (this package). It should no longer be installed separately.
- Visualization support (`ipyleaflet.Map`) is enabled when `ipyleaflet` is available. It is not installed by default, but can be installed manually, or by installing the `descarteslabs` python client with the `viz` extra (e.g. `pip install descarteslabs[viz]`). Note that in order to be compatible with JupyterLab notebooks, the `visualize()` method no longer returns the layer; it just adds it to the supplied map.
- The Vector package now has a `VectorClient` API client, with the usual support for `get_default_client()` and `set_default_client()`. Most constructors and methods now accept an optional `client=` parameter if you need to use something other than the default client.
- Configuration is now accomplished using the standard `descarteslabs.config` package. In particular, the `vector_url` setting is used to specify the default Vector host. The `VECTOR_API_HOST` environment variable is no longer consulted.
- Vector client methods now raise standard `descarteslabs.exceptions` exception classes rather than the `descarteslabs.vector.vector_exceptions` classes of the old client.
- The `is_spatial=` parameter previously accepted by many methods and functions is now deprecated and ignored. It is not required because existing type information always determines whether an operation is spatial. Warnings will be generated if it is used.
- Be advised that feature upload and download (query) do not currently support or impose any limits, and thus allow operations so large and slow that timeouts or other failures may occur. A future version will implement limits and batching, so that large operations can be supported reliably. Until then, you may wish to implement your own batching where possible to avoid encountering network limits and timeouts.
- The old client version v1.12.1 is reaching end of life and will no longer be supported as of February 2024. You can expect that version to stop working at any point after that as legacy backend support is turned off.
- Breaking Change: The deprecated `Scenes` client API has been removed.
- Breaking Change: The deprecated `Metadata` client API has been removed.
- The minimum required version of `urllib3` has been bumped to 1.26.18 to address a security vulnerability.
- The minimum required version of `shapely` has been bumped to 2.0.0 to address thread safety issues.
- Python 3.7, formerly deprecated, is no longer supported.
- Python 3.12 is not yet officially supported due to the lack of support from `blosc`. However, if you are able to provide a functional `blosc` on your own, then 3.12 should work.
- Urllib3 2.X is now supported.
- Geopandas, Pydantic, and PyArrow have been added as core dependencies to support the Vector client.
- For those users of the `clear_client_state` function (not common), the bands cache for the Catalog client is now cleared also.
- `Function.delete_jobs` was failing to implement the `delete_results` parameter, so job result blobs were not being deleted. This has been fixed.
- Added a `delete_results` parameter to `Function.delete` for consistency.
- `Job.statistics` field added, which contains statistics (CPU, memory, and network utilization) for the job. This can be used to determine the minimal resources necessary for the `Function` after some representative runs.
- Filtering on datetime attributes (such as `Function.created_at`) didn't previously work with anything but `datetime` instances. Now it also handles ISO format strings and Unix timestamps (int or float), as sketched below.
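A minimal sketch of the datetime filtering described above; the search/filter/collect pattern matches examples elsewhere in these notes, and the date values are hypothetical.

```python
from datetime import datetime

from descarteslabs.compute import Function

# An ISO format string...
recent = Function.search().filter(Function.created_at >= "2024-01-01").collect()

# ...a datetime instance...
recent = Function.search().filter(Function.created_at >= datetime(2024, 1, 1)).collect()

# ...or a Unix timestamp (int or float).
recent = Function.search().filter(Function.created_at >= 1704067200.0).collect()
```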
- Following our lifecycle policy, client versions v1.11.0 and earlier are no longer supported. They may cease to work with the Platform at any time.
- The Catalog `Blob` class now has a `get_data()` method which can be used to retrieve the blob data directly given the id, without having to first retrieve the `Blob` metadata (sketch below).
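A minimal sketch of `Blob.get_data`; the blob id shown is hypothetical.

```python
from descarteslabs.catalog import Blob

# Fetch the blob contents directly by id, without first retrieving
# the Blob metadata.
data = Blob.get_data("data/some-org:some-blob")
```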
- Breaking Change: The status values for `Function` and `Job` objects have changed, to provide a better experience managing the flow of jobs. Please see the updated Compute guide for a full explanation. Because of the required changes to the back end, older clients (i.e. v2.0.3) are supported in a best-effort manner. Upgrading to this new client release is strongly advised for all users of the Compute service.
- Breaking Change: The base images for Compute have been put on a diet. They are now themselves built from "slim" Python images, and they no longer include the wide variety of extra Python packages that were formerly included (e.g. TensorFlow, SciKit Learn, PyTorch). This has reduced the base image size by an order of magnitude, making function build times and job startup overhead commensurately faster. Any functions which require such additional packages can add them as needed via the `requirements=` parameter. While doing so will increase image size, it will generally still be much smaller and faster than the prior "everything and the kitchen sink" approach. Existing Functions with older images will continue to work as always, but any newly minted `Function` using the new client will use one of the new slim images.
- Base images are now available for Python 3.10 and Python 3.11, in addition to Python 3.8 and Python 3.9.
- Job results and logs are now integrated with Catalog Storage, so that results and logs can be searched and retrieved directly using the Catalog client as well as using the methods in the Compute client. Results are organized under `storage_type=StorageType.COMPUTE`, while logs are organized under `storage_type=StorageType.LOGS`.
- The new `ComputeResult` class can be used to wrap results from a `Function`, allowing the user to specify additional attributes for the result which will be stored in the Catalog `Blob` metadata for the result. This allows the function to specify properties such as `geometry`, `description`, `expires`, `extra_attributes`, `writers` and `readers` for the result `Blob`. The use of `ComputeResult` is not required. A sketch follows this list.
- A `Job` can now be assigned arbitrary tags (strings), and searched based on them.
- A `Job` can now be retried on errors, and jobs track error reasons, exit codes, and execution counts.
- `Function` and `Job` objects can now be filtered by class attributes (e.g. `Job.search().filter(Job.status == JobStatus.PENDING).collect()`).
- The `Job.cancel()` method can now be used to cancel the execution of a job which is currently pending or running. Pending jobs will immediately transition to `JobStatus.CANCELED` status, while running jobs will pass through `JobStatus.CANCEL` (waiting for the cancelation to be signaled to the execution engine), `JobStatus.CANCELING` (waiting for the execution to terminate), and `JobStatus.CANCELED` (once the job is no longer executing). Cancelation of running jobs is not guaranteed; a job may terminate successfully, or with a failure or timeout, before it can be canceled.
- The `Job.result()` method will raise an exception if the job does not have a status of `JobStatus.SUCCESS`. If `Job.result()` yields a `None` value, this means that there was no result (i.e. the execution returned `None`).
- The `Job.result_blob()` method will return the Catalog Storage Blob holding the result, if any.
- The `Job.delete()` method will delete any job logs, but will not delete the job result unless the `delete_results` parameter is supplied.
- The `Function` object now has attributes `namespace` and `owner`.
- The `Function.wait_for_completion()` and new `Function.as_completed()` methods provide a richer set of functionality for waiting on and handling job completion.
- The `Function.build_log()` method now returns the log contents as a string, rather than printing the log contents.
- The `Job.log()` method now returns the log contents as a list of strings, rather than printing the log contents. Because logs can be unbounded in size, there is also a new `Job.iter_log()` method which returns an iterator over the log lines.
- The `requirements=` parameter to `Function` objects now supports more `pip` magic, allowing the use of special `pip` controls such as `-f`. Also, parsing of package versions has been loosened to allow some more unusual version designators.
- Changes to the `Function.map()` method: the parameter name `iterargs` has changed to `kwargs` (the old name is still honored but deprecated), the documentation has been corrected, and enhancements support more general iterators and mappings, allowing for a more functional programming style.
- The compute package was restructured to make all the useful and relevant classes available at the top level.
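A minimal sketch of wrapping a Compute function's return value in `ComputeResult`, per the notes above. The attribute values are hypothetical, and wrapping is optional; a plain return value works as well.

```python
from descarteslabs.compute import ComputeResult

def my_compute_function(tile_key):
    # Real work would go here; this value is a stand-in.
    value = {"tile": tile_key, "answer": 42}
    # Attach extra metadata to be stored on the result Blob.
    return ComputeResult(
        value,
        description="hypothetical description stored on the result Blob",
        extra_attributes={"tile_key": tile_key},
    )
```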
- Property filters can now be deserialized as well as serialized.
- Allow deletion of `Function` objects.
  - Deleting a Function will delete all associated Jobs.
- Allow deletion of `Job` objects.
  - Deleting a Job will delete all associated resources (logs, results, etc.).
- Added attribute filtering to `Function` and `Job` objects.
  - Attributes marked `filterable=True` can be used to filter objects on the compute backend API.
- Minor optimization to `Job.iter_results`, which now uses backend filters to load successful jobs.
- `Function` bundling has been enhanced (see the sketch after this list).
  - New `include_modules` and `include_data` parameters allow multiple other modules, non-code data files, etc. to be added to the code bundle.
  - The `requirements` parameter has been improved to allow a user to pass a path to their own `requirements.txt` file instead of a list of strings.
- Allow data type `int32` in GeoTIFF downloads.
- `BlobCollection` is now importable from `descarteslabs.catalog`.
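A minimal sketch of the bundling options described above. The entry point, module names, data paths, and image id are hypothetical, and a real `Function` typically requires additional parameters (such as `cpus` and `memory`) not shown here.

```python
from descarteslabs.compute import Function

def my_entrypoint(x):
    return x * 2

fn = Function(
    my_entrypoint,
    name="my-function",
    image="python3.10:latest",               # hypothetical base image id
    include_modules=["my_package.helpers"],  # extra modules to bundle
    include_data=["data/lookup_table.csv"],  # non-code data files to bundle
    requirements="requirements.txt",         # or a list of requirement strings
)
fn.save()
```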
- Added API documentation for dynamic compute and vector
- Due to recent changes in `urllib3`, rastering operations were failing to retry certain errors which ought to be retried, causing more failures to propagate to the user than was desirable. This is now fixed.
(Release notes from all the 2.0.0 release candidates are summarized here for completeness.)
- Deprecated support for Python 3.7 (which reaches end of life in July).
- Added support for Python 3.10 and Python 3.11.
- AWS-only client. For the time being, the AWS client can be used to communicate with the legacy GCP platform (e.g. `DESCARTESLABS_ENV=gcp-production`) but only supports those services that are supported on AWS (`catalog` and `scenes`). This support may break at any point in the future, so it is strictly transitional.
- Removed many dependencies no longer required due to the removal of GCP-only features.
- Added support for Shapely 2.X. Note that user code may also be affected by breaking changes in Shapely 2.X. Use of Shapely 1.8 is still supported.
- Updated requirements to avoid `urllib3>=2.0.0`, which breaks all kinds of things.
- Major overhaul of the internals of the config process. To support other clients using namespaced packages within the `descarteslabs` package, the top level has been cleaned up, and almost all the real code now lives inside `descarteslabs.core`. End users should never have to import anything from `descarteslabs.core`. No more magic packages means that `pylint` will work well with code using `descarteslabs`.
- Configuration no longer depends upon the authorized user.
- Added support for data storage. The `Blob` class provides a mechanism to upload, index, share, and retrieve arbitrary byte sequences (e.g. files). `Blob`s can be searched by namespace and name, geospatial coordinates (points, polygons, etc.), and tags. `Blob`s can be downloaded to a local file, or retrieved directly as a Python `bytes` object. `Blob`s support the same sharing mechanisms as `Product`s, with `owners`, `writers`, and `readers` attributes. A sketch follows this list.
- Added support to `Property` for `prefix` filtering.
- The default `geocontext` for image objects no longer specifies a `resolution` but rather a `shape`, to ensure that default rastering preserves the original data and alignment (i.e. no warping of the source image).
- As with `resolution`, you can now pass a `crs` parameter to the rastering methods (e.g. `Image.ndarray`, `ImageCollection.stack`, etc.) to override the `crs` of the default geocontext.
- A bug in the code handling the default context for image collections when working with a product with a CRS based on degrees rather than meters has been fixed. Resolutions should always be specified in the units used by the CRS.
- Added support for managed batch compute under the `compute` module.
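A minimal sketch of `Blob` data storage; the name and data are hypothetical, and the retrieval method names are assumptions based on the notes in this section.

```python
from descarteslabs.catalog import Blob

# Upload arbitrary bytes under a hypothetical name; upload_data returns
# the Blob itself, so calls can be chained.
blob = Blob(name="example-bytes", tags=["demo"]).upload_data(b"hello, world")

# Retrieve it later by name (the namespace defaults to your own).
retrieved = Blob.get(name="example-bytes")
data = retrieved.data()         # directly as bytes (method name assumed)
retrieved.download("out.bin")   # or download to a local file
```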
- Fixed a bug in the handling of small blocks (less than 512 x 512) that caused rasterio to generate bad download files (the desired image block would appear as a smaller sub-block rather than filling the resulting raster).
- The defaulting of `align_pixels` has changed slightly for the `AOI` class. Previously it always defaulted to `True`. Now the default is `True` if `resolution` is set, `False` otherwise. This ensures that when specifying a `shape` and a `bounds` rather than a resolution, the `shape` is actually honored.
- When assigning a `resolution` to an `AOI`, any existing `shape` attribute is automatically unset, since the two attributes are mutually exclusive.
- The validation of bounds for a geographic CRS has been slightly modified to account for some of the irregularities of whole-globe image products, correcting unintended failures in the past.
- Fixed problem handling MultiPolygon and GeometryCollection when using Shapely 2.0.
- Loosen up the restrictions on the allowed alphabet for Blob names. Now almost any printable character is accepted save for newlines and commas.
- Added new storage types for Blobs: `StorageType.COMPUTE` (for Compute job results) and `StorageType.DYNCOMP` (for saved `dynamic-compute` operations).
- Added testing of the client.
- The defaulting of the `namespace` value for `Blob`s has changed slightly. If no namespace is specified, it will default to `<org>:<hash>` with the user's org name and unique user hash. Otherwise, any other value, as before, will be prefixed with the user's org name if it isn't already.
- `Blob.get` no longer requires a full id. Alternatively, you can give it a `name` and optionally a `namespace` and a `storage_type`, and it will retrieve the `Blob`.
- Fixed a bug causing summaries of `Blob` searches to fail.
- `Function.map` and `Function.rerun` now save the created `Job`s before returning.
- `Job.get` return values fixed, and removed an extraneous debug print.
- Updated requirements to avoid `urllib3>=2.0.0`, which breaks all kinds of things.
- The defaulting of `align_pixels` has changed slightly for the `AOI` class. Previously it always defaulted to `True`. Now the default is `True` if `resolution` is set, `False` otherwise. This ensures that when specifying a `shape` and a `bounds` rather than a resolution, the `shape` is actually honored.
- When assigning a `resolution` to an `AOI`, any existing `shape` attribute is automatically unset, since the two attributes are mutually exclusive.
- The validation of bounds for a geographic CRS has been slightly modified to account for some of the irregularities of whole-globe image products, correcting unintended failures in the past.
- The default `geocontext` for image objects no longer specifies a `resolution` but rather a `shape`, to ensure that default rastering preserves the original data and alignment (i.e. no warping of the source image).
- The `Blob.upload` and `Blob.upload_data` methods now return `self`, so they can be used in a fluent style.
- As with `resolution`, you can now pass a `crs` parameter to the rastering methods (e.g. `Image.ndarray`, `ImageCollection.stack`, etc.) to override the `crs` of the default geocontext.
- A bevy of fixes to the client.
- Added support for data storage. The `Blob` class provides a mechanism to upload, index, share, and retrieve arbitrary byte sequences (e.g. files). `Blob`s can be searched by namespace and name, geospatial coordinates (points, polygons, etc.), and tags. `Blob`s can be downloaded to a local file, or retrieved directly as a Python `bytes` object. `Blob`s support the same sharing mechanisms as `Product`s, with `owners`, `writers`, and `readers` attributes.
- Added support to `Property` for `prefix` filtering.
- Added a method to update user credentials for a `Function`.
- Added methods to retrieve build and job logs.
- Added support for Shapely 2.X.
- This is an internal-only release. There is as of yet no updated documentation. However, the user-facing client APIs remain fully compatible with v1.12.1.
- Added support for managed batch compute under the `compute` module.
- Removed the check on the Auth for configuration, since it is all AWS all the time.
- Fixed a bug in the handling of small blocks (less than 512 x 512) that caused rasterio to generate bad download files (the desired image block would appear as a smaller sub-block rather than filling the resulting raster).
- This is an internal-only release. There is as of yet no updated documentation. However, the user-facing client APIs remain fully compatible with v1.12.1.
- Deprecated support for Python 3.7 (which reaches end of life in July).
- Added support for Python 3.10 and Python 3.11.
- AWS-only client. For the time being, the AWS client can be used to communicate with the legacy GCP platform (e.g. `DESCARTESLABS_ENV=gcp-production`) but only supports those services that are supported on AWS (`catalog` and `scenes`). This support may break at any point in the future, so it is strictly transitional.
- Removed many dependencies no longer required due to the removal of GCP-only features.
- Major overhaul of the internals of the config process. To prepare for supporting other clients using namespaced packages within the `descarteslabs` package, the top level has been cleaned up, and almost all the real code now lives inside `descarteslabs.core`. However, end users should never have to import anything from `descarteslabs.core`. No more magic packages means that `pylint` will work well with code using `descarteslabs`.
- GCP environments only support `catalog` and `scenes`. All other GCP-only features have been removed.
- A bug in the code handling the default context for image collections when working with a product with a CRS based on degrees rather than meters has been fixed. Resolutions should always be specified in the units used by the CRS.
- Fixed a bug causing `descarteslabs.workflows.map.geocontext()` to fail with an import error. This problem also affected the autoscaling feature of workflows map layers.
- Fixed a bug causing downloads of single-band images to fail when utilizing rasterio.
- Catalog V2 is now fully supported on the AWS platform, including user ingest.
- Catalog V2 has been enhanced to provide substantially all the functionality of the Scenes API. The `Image` class now includes methods such as `ndarray` and `download`. A new `ImageCollection` class has been added, mirroring `SceneCollection`. The various `Search` objects now support a new `collect` method which will return appropriate `Collection` types (e.g. `ProductCollection`, `BandCollection`, and of course `ImageCollection`). Please see the updated Catalog V2 guide and API documentation for more details.
- Previously, the internal implementation of the `physical_range` attribute on various band types was inconsistent with that of `data_range` and `display_range`. It has now been made consistent, which means it will either not be set, or will contain a 2-tuple of float values. It is no longer possible to explicitly set it to `None`.
- Access permissions for bands and images are now managed directly by the product. The `readers`, `writers`, and `owners` attributes have been removed from all the `*Band` classes as well as the `Image` class. Also, the `Product.update_related_objects_permissions` and `Product.get_update_permissions_status` methods have been removed, as these are no longer necessary or supported.
- All searches for bands (other than derived bands) and images must specify one or more product ids in the filtering. This requirement can be met by using the `bands()` and `images()` methods of a product to limit the search to that product, or through a `filter(properties.product_id == ...)` clause on the search (see the sketch after this list).
- Products have a new `product_tier` attribute, which can only be set or modified by privileged users.
- `Image.upload_ndarray` will now accept either an ndarray or a list of ndarrays, allowing multiple files per image. The band definitions for the product must correspond to the order and properties of the multiple ndarrays.
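A minimal sketch of a Catalog V2 image search meeting the product-id requirement and using `collect()`; the product id, date range, and cloud-fraction threshold are hypothetical.

```python
from descarteslabs.catalog import Product, properties as p

product = Product.get("some_org:some-product")  # hypothetical product id

images = (
    product.images()
    .filter("2023-01-01" <= p.acquired < "2023-02-01")
    .filter(p.cloud_fraction < 0.2)
    .collect()  # returns an ImageCollection
)
print(len(images))
```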
- With the addition of the Scenes functionality to Catalog V2, you are strongly encouraged to migrate your Scenes-based code to use Catalog V2 instead. Scenes will be deprecated in a future release. Some examples of migrating from Scenes to Catalog V2 are included in the Catalog V2 guide. In the meantime the Scenes API has been completely reimplemented to use Catalog V2 under the hood. From a user perspective, existing code using the Scenes API should continue to function as normal, with the exception of a few differences around some little-used dark corners of the API.
- The Scenes `search_bands` now enforces the use of a non-empty `products=` parameter value. This was previously documented but not enforced.
- With the addition of the Scenes functionality to Catalog V2, you are strongly encouraged to migrate your Metadata-based code to use Catalog V2 instead. Metadata will be deprecated in a future release.
- As with Catalog and Scenes, one or more products must now be specified when searching for bands or images.
- The Raster client API now requires a `bands=` parameter for all rastering operations, such as `raster`, `ndarray` and `stack`. It no longer defaults to all bands defined on the product.
- An off-by-1/2-pixel problem was identified in the coordinate transforms underlying `DLTile.rowcol_to_latlon` and `DLTile.latlon_to_rowcol`. The problem has been corrected, and you can expect to see slight differences in the results of these two methods.
- All the REST client types, such as `Metadata` and `Raster`, now support `get_default_client()` and `set_default_client()`. This functionality was previously limited to the Catalog V2 `CatalogClient`. Whenever such a client is required, the client libraries use `get_default_client()` rather than the default constructor. This makes it easy to comprehensively redirect the library to use a specially configured client when necessary.
- The `GeoContext` types that originally were part of the Scenes package are now available in the new `descarteslabs.geo` package, with no dependencies on Scenes. This is the preferred location from which to import these classes.
- The `descarteslabs.utils` package, added in the previous release for the AWS client only, now exists in the GCP client as well, and is the preferred location to pick up the `DotDict` and `DotList` classes, the `display` and `save_image` functions, and the `Properties` class for property filtering in Catalog V2.
- The `display` method now has added support for multi-image plots; see the API documentation for the `figsize`, `nrows`, `ncols` and `layout_direction` parameters.
- The `property_filtering.GenericProperties` class has been replaced with `property_filtering.Properties`, but remains for back compatibility.
- Property filters now support `isnull` and `isnotnull` operations. This can be very useful for properties which may or may not be present, e.g. `properties.cloud_fraction.isnull | properties.cloud_fraction <= 0.2`.
- The `Config` exceptions `RuntimeError` and `KeyError` were changed to `ConfigError` exceptions from `descarteslabs.exceptions`.
- `Auth` now retrieves its URL from the `Config` settings. If no valid configuration can be found, it reverts to the commercial service (https://app.descarteslabs.com).
- Dependencies for the descarteslabs library have been updated, but remain constrained to continue to support Python 3.7.
- Numerous bug fixes.
- The extra requirement options have changed. There are four extra requirement options now: `visualization`, `tables`, `complete`, and `tests`. `visualization` pulls in extra requirements to support operating in a Jupyter notebook or environment, enabling interactive maps and graphical displays; it is not required for operating in a "headless" manner. `tables` pulls in extra requirements to support the `Tables` client. `complete` is the combination of `visualization` and `tables`. `tests` pulls in extra requirements for running the tests. As always, `pip install 'descarteslabs[complete]'` will install a fully enabled client.
- The Descartes Labs client now supports configuration to support operating in different environments. By default, the client will configure itself for standard usage against the GCP platform (`"gcp-production"`), except in the case of AWS Marketplace users, for whom the client will configure itself against the AWS platform (`"aws-production"`). Alternate environments can be configured by setting the `DESCARTESLABS_ENV` environment variable before starting Python, or by using a prelude like

  ```python
  from descarteslabs.config import Settings

  Settings.select_env("environment-name")
  ```

  before any other imports of any part of the descarteslabs client package.
- The new AWS Enterprise Accelerator release currently includes only Auth, Configuration and the Scenes client.
- The `descarteslabs.client.auth` package has moved to `descarteslabs.auth`. It is now imported into the original location at `descarteslabs.client.auth` to continue to work with existing code, but new code should use the new location.
- The `descarteslabs.client.exceptions` module has moved to `descarteslabs.exceptions`. It is now imported into the original location at `descarteslabs.client.exceptions` to continue to work with existing code, but new code should use the new location.
- Fixed an issue in `scenes.DLTile.from_shape` where there would be incomplete coverage of certain geometries. The function may now return more tiles than before.
- Added support for the new `all_touched` parameter to the different `GeoContext` types. Default behavior remains the same as always, but if you set `all_touched=True`, this communicates to the raster service that you want the image(s) rastered using GDAL's `CUTLINE_ALL_TOUCHED` option, which will change how source pixels are mapped to output pixels. This mode is only recommended when using an AOI which is smaller than the source imagery pixel resolution.
- The DLTile support has been fixed to avoid generating gaps when tiling regions that span a large distance north-to-south and straddle meridians which are boundaries between UTM zones. Methods such as `DLTile.from_shape` may return more tiles than previously, but properly covering the region.
- Added support for retrieving products and bands.
  - Methods added: `get_product`, `get_band`, `get_derived_band`, `search_products`, `search_bands`, `search_derived_bands`.
  - Disallows search without the `products` parameter.
- Scaling support has been enhanced to understand processing levels for newer products. The `Scene.scaling_parameters` and `SceneCollection.scaling_parameters` methods now accept a `processing_level` argument, and this will be factored into the determination of the default result data type and scaling for all rastering operations such as `Scene.ndarray` and `SceneCollection.mosaic`.
- If the user provides the `rasterio` package (which implies providing GDAL), then rasterio will be used to save any downloaded images as GeoTIFF, allowing for the use of compression. Otherwise, by default the `tifffile` support will be used to generate the GeoTIFF files, but compression is not supported in this mode.
- As the Places client has been deprecated, so has any use of the `place=` parameter supported by several of the Scenes functions and methods.
- (Core users only) Added support for specifying the image index to use when creating a new `Product`.
- Added support for defining per-processing-level `data_type`, `data_range`, `display_range` and `physical_range` properties on processing level steps.
- Added support for filtering `Asset`s by type and name fields.
  - Supported filter types: `blob`, `folder`, `namespace`, `sym_link`, `sts_model`, and `vector`. Specifying multiple types will find assets matching any given type.
  - The name field supports the following wildcards: `*` matches 0 or more of any character; `?` matches 1 of any character.
  - Find assets matching type of `blob` and having a display name of `file name.json` or `file2name.txt` but not `filename.json`:
    - `Discover().list_assets("asset/namespace/org:some_org", filters="type=blob&name=file?name.*")`
    - `Discover().list_assets("asset/namespace/org:some_org", filters=AssetListFilter(type=AssetType.BLOB, name="file?name.*"))`
  - Find assets of type `blob` or `vector`:
    - `Discover().list_assets("asset/namespace/org:some_org", filters="type=blob,vector")`
    - `Discover().list_assets("asset/namespace/org:some_org", filters=AssetListFilter(type=[AssetType.BLOB, AssetType.VECTOR], name="file?name.*"))`
- `Metadata.products` and `Metadata.available_products` now properly implement paging so that, by default, a `DotList` containing every matching product accessible to the user is returned.
- If the user provides the `rasterio` package (which implies providing GDAL), then rasterio will be used to save any downloaded images as GeoTIFF, allowing for the use of compression. Otherwise, by default the `tifffile` support will be used to generate the GeoTIFF files, but compression is not supported in this mode.
- Fixed an issue that caused a user's schema to be overwritten if they didn't provide a primary key on table creation.
- Now uses Discover backend filtering for `list_tables()` instead of filtering on the client, to improve performance.
- `list_tables()` now supports filtering tables by name, e.g. `Tables.list_tables(name="Test*.json")`.
- New Tasks images for this release bump the versions of several dependencies, please see the Tasks guide for detailed lists of dependencies.
- The new Workbench release bumps the versions of several dependencies.
- Added support for the new `all_touched` parameter to the different `GeoContext` types. See the description above under Scenes.
- The Places client has been deprecated, and use thereof will generate a deprecation warning.
- The older Catalog V1 client has been deprecated, and use thereof will generate a deprecation warning. Please use the Catalog V2 client in its place.
- Documentation has been updated to include the "AWS Enterprise Accelerator" release.
- With Python 2 far in the rearview mirror, the dependencies on the `six` python package have been removed throughout the library, the distribution, and all tasks images.
- Added support for Python 3.9.
- Removed support for Python 3.6 which is now officially End Of Life.
- Added support for organizational sharing. You can now share using the `Organization` type: `workflows.add_reader(Organization("some_org"))`.
- Added support for organizational sharing. You can now share using the `Organization` type: `asset.share(with_=Organization("some_org"), as_="Viewer")`.
- Allow user to list their organization's namespace: `Discover().list_asset("asset/namespace/org:some_org")`.
- Allow user to list their organization's users: `Discover().list_org_users()`.
- Added an alpha Tables client. The Tables module lets you organize, upload, and query tabular data and vector geometries. As an alpha release, we reserve the right to modify the Tables client API without any guarantees about backwards compatibility. See the Tables API and Tables Guide documentation for more details.
- Added the `progress=` parameter to the various rastering methods such as `Scene.ndarray`, `Scene.download`, `SceneCollection.mosaic`, `SceneCollection.stack`, `SceneCollection.download` and `SceneCollection.download_mosaic`. This can be used to enable or disable the display of progress bars (see the sketch below).
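A minimal sketch of disabling the progress bar; the tile location and product id are hypothetical.

```python
import descarteslabs as dl

aoi = dl.scenes.DLTile.from_latlon(35.6, -105.2, resolution=10.0, tilesize=512, pad=0)
scenes, ctx = dl.scenes.search(aoi, products=["some_org:some-product"], limit=5)

# progress=False suppresses the progress bar, e.g. in headless scripts.
arr = scenes.mosaic("red green blue", ctx, progress=False)
```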
- Support for Python 3.9 images has been added, and support for Python 3.6 images has been removed.
- Many of the add-on packages have been upgraded to more recently released versions. In particular, `tensorflow` was updated from version 2.3 to version 2.7.
- GPU support was bumped up from CUDA 10 to CUDA 11.2.
- Fixed a bug preventing retry-able errors (such as a 429) from being retried.
- Allow retrieving an `Attribute` as a class attribute; it used to raise an exception.
- Fixed a bug preventing the user from writing JPEG files with smaller than 256x256 tiles.
- Allow specifying a `NoData` value for non-JPEG GeoTIFF files.
- Include band description metadata in created GeoTIFF files.
- Support scaling parameters as lists as well as tuples.
- Add caching of band metadata to drastically reduce the number of metadata queries when creating `SceneCollection`s.
- `DLTiles.from_shape` was failing to handle shape objects implementing the `__geo_interface__` API, most notably several of the Workflows `GeoContext` types. These now work as expected.
- Certain kinds of network issues could lead to rastering operations raising an `IncompleteRead` exception. This is now correctly caught and retried within the client library.
- Users can now use `descarteslabs.tasks.update_credentials()` to update their task credentials in case they have become outdated.
- We have introduced a hard limit of 120 as the number of outstanding Workflows compute jobs that a single user can have. This limit exists to minimize situations in which a user is unable to complete jobs in a timely manner, by ensuring resources cannot be monopolized by any individual user. The API that backs the calls to `compute` will return a `descarteslabs.client.grpc.exceptions.ResourceExhausted` error if the caller has too many outstanding jobs. Prior to this release (1.9.0), these failures would be retried up to some small retry limit. With the latest client release, however, the client will fail without retrying on an HTTP 429 (rate limit exceeded) error. For users with large (non-interactive) workloads who don't mind waiting, we added a new `num_retries` parameter to the `compute` function; when specified, the client will handle any 429 errors and retry up to `num_retries` times.
- Workflows is currently optimized for interactive use cases. If you are submitting large numbers of long-running Workflows compute jobs with `block=False`, you should consider using Tasks and Scenes rather than the Workflows API.
- Removed `ResourceExhausted` exceptions from the list of exceptions we automatically catch and retry on for `compute` calls.
- Lots of improvements, additions, and clarifications in the API documentation.
- The Workflows client no longer validates `processing_level` parameter values, as these have been enhanced to support new products and can only be validated server-side.
- Catalog V2 bands now support the `vendor_band_name` field (known as `name_vendor` in Metadata/Catalog V1).
- Scenes support for masking in version 1.8.1 had some regressions which have been fixed. For this reason, version 1.8.1 has been pulled from PyPI.
- New task groups now default to a `maximum_concurrency` value of 5, rather than the previous 500. This avoids the common problem of deploying a task group with newly developed code, and having it scale up, turning small problems into big problems! You may still set values as large as 500.
- The Tasks client now provides an `update_group()` method which can be used to update many properties of an existing task group, including but not limited to `name`, `image`, `minimum_concurrency`, and `maximum_concurrency`.
- Improved testing across several sub-packages.
- Various documentation fixes.
**Version Deprecated**: Due to some regressions in the Scenes API, this version has been removed from PyPI.
- Added a new `common.dltile` library that performs geospatial transforms and tiling operations.
- Upgraded various dependencies: `requests[security]>=2.25.1,<3`, `six>=1.15.0`, `blosc==1.10.2`, `mercantile>=1.1.3`, `Pillow>=8.1.1`, `protobuf>=3.14.0,<4`, `shapely>=1.7.1,<2`, `tqdm>=4.32.1`, `traitlets>=4.3.3,<6;python_version<'3.7'`, `traitlets==5.0.5,<6;python_version>='3.7'`, `markdown2>=2.4.0,<3`, `responses==0.12.1`, `freezegun==0.3.12`, `imagecodecs>=2020.5.30;python_version<'3.7'`, `imagecodecs>=2021.5.20;python_version>='3.7'`, `tifffile==2020.9.3;python_version<'3.7'`, `tifffile==2021.4.8;python_version>='3.7'`.
- Added an alpha Discover client. Discover allows users to organize and share assets with other users. As an alpha release, we reserve the right to modify the Discover client API without any guarantees about backwards compatibility. See the Discover API documentation for more details.
- breaking: Image (Scene) metadata now accepts and returns the `bucket` and `directory` fields as lists of strings, of a length equal to that of the `files` fields. This allows the file assets making up an image to live in different locations. When creating new images, a simple string can still be provided for these fields; it will automatically be converted to a list of (duplicated) strings as necessary. As most users will never interact with these fields, the change should not affect user code.
- The `derived_params` field for Image (Scene) metadata is now supported for product-specific, service-implemented "native derived bands", which may only be created for core products.
- Scenes now uses the client-side `dltile` library to make DLTiles. This improves performance when creating a large number of DLTile objects.
- Scenes `DLTile.from_shape` now has a parameter to return tile keys only, instead of full tile objects. Usage details can be found in the docs.
- Scenes DLTile now has new methods: `iter_from_shape`, which takes the same arguments as `from_shape` but returns an iterator; `subtile`, which adds the ability to subdivide tiles; and `rowcol_to_latlon` and `latlon_to_rowcol`, which convert pixel coordinates to spatial coordinates and vice versa (see the sketch after this list).
- Scenes DLTile now has a new parameter `tile_extent`, which is the total size of the tile in pixels including padding. Usage details can be found in the docs.
- breaking: Removed the dependence on `Raster` for tiling. The `raster_client` parameter has been removed from the `from_latlon`, `from_key`, `from_shape`, and `assign` DLTile methods.
- Tiling using `from_shape` may return a different number of tiles compared to previous versions under certain conditions. These tiles are usually found in overlapping areas between UTM zones and should not affect the overall coverage.
- DLTile geospatial transformations are guaranteed to be within eight decimal points of the past implementation.
- DLTile errors now come from the `dltile` library, and error messages should now be more informative.
- When specifying output bounds in a spatial reference system different from the underlying raster, a densified representation of the bounding box is used internally to ensure that the returned image fully covers the bounds. For certain methods (like `mosaic`) this may change the returned image dimensions, depending on the SRSs involved.
- breaking: As with the Metadata v1 client changes, the `bucket` and `directory` fields of the Scene properties are now multi-valued lists.
- Scenes does not support writing GeoTiffs to file-like objects. Non-JPEG GeoTiffs are always uncompressed.
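A minimal sketch of the new DLTile helpers; the coordinates and subdivision factor are hypothetical.

```python
import descarteslabs as dl

tile = dl.scenes.DLTile.from_latlon(35.6, -105.2, resolution=10.0, tilesize=1024, pad=16)

# Convert between spatial and pixel coordinates.
row, col = tile.latlon_to_rowcol(35.6, -105.2)
lat, lon = tile.rowcol_to_latlon(row, col)

# Subdivide the tile into smaller tiles.
subtiles = list(tile.subtile(4))
```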
- `dltiles_from_shape`, `dltiles_from_latlon`, and `dltile` have been removed. It is strongly recommended to test any existing code which uses the Raster API when upgrading to this release.
- Fully masked arrays are now supported and are the default. Usage details can be found in the docs.
- Added support to draw a progress bar. Usage details can be found in the docs.
- The signature and return value of `Raster.raster()` have changed. The `save=` parameter has been removed, as the resulting download is always saved to disk, to a file named by the `outfile_basename=` parameter. The method returns a tuple containing the name of the resulting file and the metadata for the retrieval, which is now an ordinary Python dictionary.
- As with Scenes, when specifying output bounds in a spatial reference system different from the underlying raster, a densified representation of the bounding box is used internally to ensure that the returned image fully covers the bounds. For certain methods (like `mosaic`) this may change the returned image dimensions, depending on the SRSs involved.
Internal release only. See 1.8.1 above.
- Upgraded various dependencies: `blosc==1.10.2`, `cachetools>=3.1.1`, `grpcio>=1.35.0,<2`, `ipyleaflet>=0.13.3,<1`, `protobuf>=3.14.0,<4`, `pyarrow>=3.0.0`, `pytz>=2021.1`.
- Upgraded from using Travis to GitHub Actions for CI.
- Added support for the `physical_range` property on `SpectralBand` and `MicrowaveBand`.
- Workflows sharing. Support has been added to manage sharing of `Workflow` objects with other authorized users. The `public` option for publishing workflows has been removed now that `Workflow.add_public_reader()` provides the equivalent capability. See the Workflows Guide.
- Lots of improvements to API documentation and the Workflows Guide.
- Allow constructing `Float` instances from literal python integers.
Fixes a few buglets which slipped through. This release continues to use the workflows channel v0-18.
- Fixed a problem with the defaulting of the visual options when generating tile URLs, making it possible to toggle the checkerboard option on a layer and see the difference.
- Support `axis=list(...)` for `Image`.
- Corrected the results of doing arithmetic on two widgets (e.g. adding two `IntSlider`s together should yield an `Int`).
- For single-band imagery, `VizOption` will accept a single two-tuple for the `scales=` argument.
- Python 3.6 is now deprecated, and support will be removed in the next version.
- Added support to Bands for new processing levels and processing step specifications to support Landsat Collection 2.
- The new channel `v0-18` utilizes a new and improved backend infrastructure. Any previously saved workflows and jobs from earlier channels are not accessible from the new infrastructure, so you will need to recreate and persist (e.g. publish) new versions using `v0-18`. Older releases and older channels can continue to access your originals if needed.
- `wf.widgets` lets you quickly explore data interactively. Add widgets anywhere in your code just like normal values, and the widgets will display automatically when you call `.visualize`.
- View shared Workflows and XYZs in GIS applications using WMTS. Get the URL with `wf.wmts_url()`, `XYZ.wmts_url()`, `Workflow.wmts_url()`.
- Create publicly-accessible tiles and WMTS endpoints with `wf.XYZ(..., public=True)`. Anyone with the URL (which is a cryptographically random ID) can view the data, no login required. Set `days_to_expiration` to control how long the URL lasts.
- `wf.XYZ.list()` to iterate through all XYZ objects you've created, and `XYZ.delete` to delete them.
- Set default visualization options (scales, colormap, bands, etc.) in `.publish` or `wf.XYZ` with `wf.VizOption`. These `viz_options` are used when displaying the published object in a GIS application, or with `wf.flows`.
- `ImageCollection.visualize()`: display ImageCollections on `wf.map`, and select the reduction operation (mean, median, mosaic, etc.) interactively.
- `Image.reduction()` and `ImageCollection.reduction()` (like `ic.reduction("median", axis="images")`) to reduce an Image/ImageCollection with an operation provided by name.
- `wf.map.controls` is accessible (you had to do `wf.map.map.controls` before).
- Access the parameters used in a Job with `Job.arguments` and `Job.geoctx`.
- Errors like `In 'or': : operand type(s) all returned NotImplemented from __array_ufunc__` when using the bitwise-or operator `|` are resolved.
- Errors when using computed values in the `wf.Datetime` constructor (like `wf.Datetime(wf.Int(2019) + 1)`) are resolved.
- `wf.Timedelta` can be constructed from floats, and supports all binary operations that Python does (support for `/`, `//`, `%`, `*` added).
- In `.rename_bands`, prohibit renaming a band to a name that already exists in the Image/ImageCollection. Previously, this would succeed, but cause downstream errors.
- `.bandinfo.get("bandname", {})` now works; before, providing `{}` would fail with a `TypeError`.
- Indexing an `Any` object (like `wf.Any({"foo": 1})["foo"]`) behaves correctly.
- `wf.Datetime`s constructed from strings containing timezone information are handled correctly.
- `.mask(new_mask)` ignores masked pixels in `new_mask`. Previously, masked pixels in `new_mask` were considered True, not False. Note that this is opposite of NumPy's behavior.
- If you `.publish` an object that depends on `wf.parameters` or `wf.widgets`, it's automatically converted into a `wf.Function`.
- breaking: `.compute` and `.inspect` no longer accept extra arguments that aren't required for the computation. If the object doesn't depend on any `wf.parameters` or `wf.widgets`, passing extra keyword arguments will raise an error. Similarly, not providing keyword arguments for all parameters the object depends on will raise an error.
- breaking: The `wf.XYZ` interface has changed; construct an XYZ with `wf.XYZ(...)` instead of `wf.XYZ.build(...).save()`.
- Set `days_to_expiration` on `XYZ` objects. After this many days, the object is deleted.
- `Job` metadata is deleted after 10 days; `wf.Job.get(...)` on a job ID more than 10 days old will fail. Note that Job results have always been deleted after 10 days; now the metadata expires as well.
- `wf.Function` has better support for named arguments. Now, `f = wf.Function[{'x': wf.Int, 'y': wf.Str}, wf.Int]` requires two arguments `x` and `y`, and they can be given positionally (`f(1, "hi")`), by name in any order (`f(x=1, y="hi")` or `f(y="hi", x=1)`), or both (`f(1, y="hi")`).
- `wf.Function.from_callable` will generate a Function with the same names as the Python function you decorate or pass in. Therefore, when using `@wf.publish` as a decorator, the published Function will automatically have the same argument names as your Python function.
- Python 3.8 is now supported in the client.
- As Python 3.5 has reached End Of Life, it is no longer supported by the descarteslabs client.
- Altered the behavior of Task function creation. Deprecation warnings will be issued when attempting to create a Task function for which support will be removed in the near future. It is strongly recommended to test any existing code which uses the Tasks client when upgrading to this release.
- New tasks public images for use with Python 3.8 are available.
- `.pick_bands` supports proxy `wf.Str` objects; `.unpack_bands` supports `wf.Str` and `wf.Tuple[wf.Str, ...]`.
- Better performance constructing a `wf.Array` from a `List` of numbers (like `wf.Array(ic.sum(["pixels", "bands"]))`).
- No more error using `@wf.publish` as a decorator on a function without a docstring.
No more irrelevant DeprecationWarnings when importing the descarteslabs package (#235). Deprecated functionality in the package will now show FutureWarnings instead.
- `wf.map.geocontext` doesn't raise an error about the CRS of the map.
- `wf.flows` doesn't raise an error about versions from incompatible channels.
- Example code has been cleaned up.
- Sharing of any Workflows object as a `Workflow`, with version and access control. Browse through shared `Workflow`s with the `wf.flows` browser widget.
- Upload images to the DL catalog from Workflows jobs. Usage details can be found in the docs.
- `wf.np.median`.
- `Job.cancel()` to cancel running jobs.
- Transient failures in Jobs are automatically retried, resulting in fewer errors.
- Search widget on `wf.map` by default.
- Bitwise operations on imagery no longer fail.
- `wf.np.linspace` no longer fails when being called correctly.
- `median` is slightly less prone to OOM errors.
- Breaking: Workflows sharing: `wf.publish()` and `wf.use()` have new signatures, `wf.retrieve()` has been removed in favor of `wf.Workflow.get()` and `wf.VersionedGraft.get_version()`, and the `wf.Workflow` object has been completely refactored. Detailed information is in the docs.
- `Array.to_imagery` now accepts `KnownDict` for bandinfo and properties.
- Numbers can now be constructed from `Str`s.
- Output formats for `.compute` including GeoTIFF, JSON, PyArrow, and MessagePack. Usage details can be found in the docs.
- Destinations for Job results: download and email. Usage details can be found in the docs.
- Save `.compute` outputs to a file with the `file=` argument (see the sketch after this list).
- Pixel value inspector: click in the map widget to view pixel values.
- `wf.ifelse` for simple conditional logic.
- NumPy functions including `hypot`, `bitwise_and`, `bitwise_or`, `bitwise_xor`, `bitwise_not`, `invert`, and `ldexp`.
- Bitwise `Array` and `MaskedArray` operations.
- `size` attribute on `Array` and `MaskedArray`.
- `astype` function on `Array` and `MaskedArray` for changing the dtype.
- `flatten` function on `Array` and `MaskedArray` for flattening into a 1D array.
- `MaskedArray.compressed` for getting all unmasked data as a 1D array.
- `get` function on `Dict` and `KnownDict` for providing a default value if a key does not exist.
- `nbands` attribute on `Image` and `ImageCollection`.
- `proxify` can handle `scenes.GeoContext`s.
- `Dict.contains`, `Dict.length`.
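A minimal sketch of the output-format and `file=` features above; the image id and tile location are illustrative placeholders, not real data:

```python
import descarteslabs as dl
import descarteslabs.workflows as wf

# Placeholders: substitute a real product/image id and a location you can access.
img = wf.Image.from_id("some-product:some-image-id").pick_bands("red green blue")
ctx = dl.scenes.DLTile.from_latlon(35.6, -105.0, resolution=10.0, tilesize=512, pad=0)

# Save the rastered result straight to disk; an explicit output format can be
# requested with format=, e.g. format="geotiff", per the entry above.
img.compute(ctx, file="rgb.tif")
```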
- Fewer failures and hanging calls when connecting to the Workflows backend (like `.compute`, `.visualize`, `Job.get`, etc.).
- `wf.numpy.histogram` works correctly with computed values for `range` and `bins` (such as `range=[arr.min(), arr.max()]`).
- More consistent throughput when a large number of jobs are submitted.
- `Array`s can now be constructed from proxy `List`s.
- `MaskedArray.filled` works correctly when passed Python values.
- Long-running sessions (like Jupyter kernels) refresh credentials instead of failing with auth errors after many hours of use.
- `wf.numpy.dot` and `wf.numpy.einsum` no longer fail when being called correctly.
- Occasional errors like `('array-89199362e9a5d598fb5c82805136834d', 0, 0)` when calling `wf.compute()` with multiple values are resolved.
- `pick_bands` accepts duplicate band names. Enjoy easier Sentinel-1 `"vv vh vv"` visualizations!
- `ImageCollection.from_id` is always ordered by date.
- `wf.numpy.percentile` no longer accepts an `axis` argument.
- breaking: `wf.Job` construction and interface changes:
  - Use a single `wf.Job(...)` call instead of `wf.Job.build(...).execute()` to create and launch a Job.
  - New `Job.result_to_file` method.
  - `Job.status` is removed in favor of a single `Job.stage`.
  - `wf.TimeoutError` renamed to `wf.JobTimeoutError`.
- 191 functions from NumPy are available for Workflows `Array`s, including parts of the `numpy.linalg` and `numpy.ma` submodules. See the full list in the docs.
- `index_to_coords` and `coords_to_index` methods on `Image`/`ImageCollection`/`GeoContext` for converting between geospatial and array coordinates.
- `value_at` function on `Image` and `ImageCollection` for extracting single pixel values at spatial coordinates.
- Using datetimes as parameters to `visualize` behaves correctly.
- Fixed a bug that prevented uploading ndarrays of type `uint8`.
- Array support for `argmin`, `argmax`, `any`, `all`.
- `pick_bands` supports an `allow_missing` kwarg to drop band names that may be missing from the data without an error.
- `wf.compute` supports passing lists or tuples of items to compute at the same time. Passing multiple items to `wf.compute`, rather than calling `obj.compute` for each separately, is usually faster (see the sketch below).
- Casting from `Bool` to `Int`: `wf.Int(True)`.
- Experimental `.inspect()` method for small computations during interactive use.
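A minimal sketch of computing multiple items in one request; the product id, band names, and tile location are placeholders:

```python
import descarteslabs as dl
import descarteslabs.workflows as wf

# Placeholders: substitute a real product id and a location you can access.
nir, red = (
    wf.ImageCollection.from_id(
        "some-product", start_datetime="2019-01-01", end_datetime="2019-02-01"
    )
    .mosaic()
    .unpack_bands("nir red")
)
ndvi = (nir - red) / (nir + red)
diff = nir - red

ctx = dl.scenes.DLTile.from_latlon(35.6, -105.0, resolution=10.0, tilesize=512, pad=0)

# A single call computes both proxy objects in one request, which is usually
# faster than calling .compute on each separately.
ndvi_result, diff_result = wf.compute([ndvi, diff], ctx)
```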
- [breaking] Array no longer uses type parameters: now you construct an Array with `wf.Array([1, 2, 3])`, not `wf.Array[wf.Int, 1]([1, 2, 3])`. Remember, Array is an experimental API and will continue to make frequent breaking changes!
- Workflows now reuses the same gRPC client by default, so repeated or parallel calls to `.compute`, etc. will be faster. Calling `.compute` within a thread pool will also be significantly more efficient.
- `wf.numpy.histogram` correctly accepts a `List[Float]` as the `range` argument.
1.1.2 fixes a bug which caused Workflows map layers to behave erratically when changing colormaps.
1.1.1 fixes a packaging issue that caused `import descarteslabs.workflows` to fail.
It also makes NumPy an explicit dependency. NumPy was already a transitive dependency, so this shouldn't cause any changes.
You should NOT install version 1.1.0; 1.1.1 should be used instead in all circumstances.
- `Image.upload()` now emits a deprecation warning if the image has a `cs_code` or `projection` property. The projection defined in the uploaded file is always used and applied to the resulting image in the Catalog.
- `Image.upload_ndarray()` now emits a deprecation warning if the image has both a `cs_code` and a `projection` property. Only one of them may be supplied, and `cs_code` is given preference.
- `SceneCollection.download_mosaic` has new default behavior for `mask_alpha` wherein the `alpha` band will be used as a mask by default if it is available for all scenes in the collection, even if it is not specified in the list of bands.
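A hedged sketch of the new `mask_alpha` default; the product id and AOI polygon are placeholders:

```python
import descarteslabs as dl

# Placeholders: a small AOI polygon and an imaginary product id.
aoi = {
    "type": "Polygon",
    "coordinates": [[[-105.0, 35.0], [-104.9, 35.0], [-104.9, 35.1],
                     [-105.0, 35.1], [-105.0, 35.0]]],
}
scenes, ctx = dl.scenes.search(aoi, products=["some-product"], limit=5)

# If every scene has an alpha band, it is now used for masking by default,
# even though "alpha" is not listed among the requested bands.
scenes.download_mosaic("red green blue", ctx.assign(resolution=60.0), dest="mosaic.tif")
```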
- Experimental Array API following the same syntax as NumPy arrays. It supports vectorized operations, broadcasting, and multidimensional indexing (see the sketch after this list).
  - The `ndarray` attribute of `Image` and `ImageCollection` will return a `MaskedArray`.
  - Over 60 NumPy ufuncs are now callable with Workflows `Array`.
  - Includes other useful `Array` functions like `min()`, `median()`, `transpose()`, `concatenate()`, `stack()`, `histogram()`, and `reshape()`.
- `ImageCollection.sortby_composite()` for creating an argmin/argmax composite of an `ImageCollection`.
- Slicing of `List`, `Tuple`, `Str`, and `ImageCollection`.
- `wf.range` for generating a sequence of numbers between start and stop values.
- `ImageCollectionGroupby.mosaic()` for applying `ImageCollection.mosaic` to each group.
- `wf.exp()`, `wf.square()`, `wf.log1p()`, `wf.arcsin()`, `wf.arccos()`, and `wf.arctan()`.
- `Datetime.is_between()` for checking if a `Datetime` falls within a specified date range.
- `FeatureCollection.contains()`.
- Container operations on `GeometryCollection` including:
  - `GeometryCollection.contains()`
  - `GeometryCollection.sorted()`
  - `GeometryCollection.map()`
  - `GeometryCollection.filter()`
  - `GeometryCollection.reduce()`
- `List` and `Tuple` can now be compared with other instances of their type via `__lt__()`, `__eq__()`, etc.
- `List.__add__()` and `List.__mul__()` for concatenating and duplicating `List`s.
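A minimal sketch of the experimental Array API mentioned above; the product id is a placeholder:

```python
import descarteslabs.workflows as wf

# Placeholder: substitute a real product id.
ic = wf.ImageCollection.from_id(
    "some-product", start_datetime="2019-06-01", end_datetime="2019-07-01"
)
arr = ic.ndarray                       # MaskedArray with shape (images, bands, y, x)
scaled = wf.numpy.sqrt(arr / 10000.0)  # NumPy ufuncs work on Workflows Arrays
composite = scaled.median(axis=0)      # plus Array methods like median and reshape
```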
- Products without an alpha band and `nodata` value are rejected, instead of silently producing unwanted behavior.
- `ImageCollection.concat_bands` now throws a better error when trying to concatenate bands from another `ImageCollection` that is not the same length.
- `Any` is now promotable to all other types automatically.
- Better error when trying to iterate over Proxytypes.
- Interactive map: calls to `visualize` now clear layer errors.
- Interactive map: when setting scales, invalid values are highlighted in red.
- Interactive map: a scalebar is shown on the bottom-left by default.
- `ImageCollection.mosaic()` is now in "last-on-top" order, which matches GDAL and `dl.raster`. Use `mosaic(reverse=True)` for the same ordering as in v1.0.0.
- Better errors when specifying invalid type parameters for Proxytypes that require them.
- Field access on `Feature`, `FeatureCollection`, `Geometry`, and `GeometryCollection` no longer fails.
- In `from_id`, processing level 'cubespline' no longer fails.
> As of January 1st, 2020, the client library no longer supports Python 2. For more information, please contact [email protected]. For help with porting to Python 3, please visit https://docs.python.org/3/howto/pyporting.html.
- There is an entirely new backend supporting asynchronous uploads of image files and ndarrays with the catalog client. There are minor changes to the `ImageUpload` class (a new `events` field has subsumed `errors`, and the `job_id` field has been removed) but the basic interface is unchanged, so most code will keep functioning without any changes.
- It is now possible to cancel image uploads.
- Error messages are now easier to read.
- Many improvements to the documentation.
- You can now create or retrieve an existing object using the `get_or_create` method.
- Retrieving a `Band` or `Image` by name is now possible by calling `get_band` or `get_image` on the `Product` instance. You can also use the Product's `named_id` function to get a complete id for images and bands.
- A new convenience function `make_valid_name` on `Image` and `Band` classes will return a sanitized name without invalid characters.
- A new property `ATTRIBUTES` enumerates which attributes are available for a specific catalog object.
- Trying to set an attribute that does not exist will now raise `AttributeError`.
- `update_related_objects_permissions()` should no longer fail with a JSON serialization error.
- Setting a read-only attribute will now raise an `AttributeValidationError`.
- Saving a new object while one with the same id already exists will now raise a `ConflictError` instead of `BadRequestError`.
- If a retrieved object has since been deleted from the catalog, saving any changes or trying to reload it will now raise a `DeletedObjectError`.
- Resolution fields now accept string values such as "10m" or "0.008 degrees". If the value cannot be parsed, an `AttributeValidationError` will be raised.
- Changes to the `extra_properties` attribute are now tracked correctly.
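A minimal sketch of the `get_or_create`, `get_band`, and `named_id` conveniences above; the product and band ids are illustrative placeholders:

```python
from descarteslabs.catalog import Product

# Placeholder id: a real id is namespaced by your organization.
product = Product.get_or_create("my-org:my-product")
product.name = "My Product"
product.save()

band = product.get_band("red")      # retrieve a band by name
full_id = product.named_id("red")   # complete id, e.g. "my-org:my-product:red"
```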
- This release no longer supports Python 2.
- This package is now distributed as a Python 3 wheel which will speed up installation.
- Handling of missing data via empty ImageCollections:
  - `ImageCollection.from_id` returns an empty ImageCollection if no data exist for the given time/place, rather than an error.
  - `ImageCollection.filter` returns an empty ImageCollection if the predicate is False for every Image, rather than an error.
  - `Image.replace_empty_with` and `ImageCollection.replace_empty_with` for explicitly filling in missing data.
  - See the Workflows guide for more information.
- Docstrings and examples on every class and function!
- Assigning new metadata to Image properties & bandinfo: `Image.with_properties()`, `Image.with_bandinfo()`.
- Interactive map: colorbar legends on layers with colormaps (requires matplotlib).
- `Dict.from_pairs`: construct a Dict from a sequence of key-value pairs.
- Map displays a fullscreen button by default ([breaking] if your code adds one, you'll now get two).
- `wf.concat` for concatenating `Image` and `ImageCollection` objects.
- `ImageCollection.concat` now accepts `Image` objects; new `Image.concat` accepts `Image` or `ImageCollection`.
- `ImageCollection.mosaic()`
- `FeatureCollection.sorted()`, `FeatureCollection.length()`, `FeatureCollection.__reversed__()`
- `GeometryCollection.length()`, `GeometryCollection.__reversed__()`
- `wf.zip` now supports `ImageCollection`, `FeatureCollection`, `GeometryCollection` as well as `List` and `Str`.
- Get a GeoContext for the current bounds of the map in any resolution, shape, or CRS (including "utm", which automatically picks the right UTM zone for you) with `wf.map.geocontext`. Also now returns a Scenes GeoContext for better introspection and use with Raster.
- Better backend type-checking displays the possible arguments for most functions if called incorrectly.
- `arr_shape` included when calling `wf.GeoContext.compute()`.
- More readable errors when communication with the backend fails.
- Interactive map: layout handles being resized, for example setting `wf.map.layout.height = '1000px'`.
- `Any` is no longer callable; `Any.cast` encouraged.
- `remove_layer` and `clear_layers` moved from `wf.interactive.MapApp` class to `wf.interactive.Map` (non-breaking change).
- [possibly breaking] Band renaming in binary operators only occurs when broadcasting: `red + red` is just `red`, rather than `red_add_red`. `red + blue` is still `red_add_blue`. Code which depends on accessing bands by name may need to change.
- `wf.where` propagates masks correctly, and handles metadata correctly with multi-band inputs.
- `processing_level="surface"` actually returns surface-reflectance-processed imagery.
- `ImageCollection.sorted()` works properly.
- Viewing global-extent WGS84 images on the Workflows map no longer causes errors.
- `List` proxytype no longer infinitely iterable in Python.
- Repeated use of `axis="bands"` works correctly.
- `ImageCollection.from_images` correctly aligns the bands of the inputs.
- Numeric casting (`wf.Int(wf.Float(2.2))`) works as expected.
- More descriptive error when constructing an invalid `wf.Datetime`.
- Computing a single `Bool` value derived from imagery works correctly.
- Update workflows client channel
- Workflows map UI is more stable: errors and layers won't fill the screen
- Catalog client: Added an `update()` method that allows you to update multiple attributes at once.
- Catalog client: Images and Bands no longer reload the Product after calling `save`.
- Catalog client: Various attributes that are lists now correctly track changes when modifying them with list methods (e.g. `Product.owners.append("foo")`).
- Catalog client: Error messages generated by the server have a nicer format.
- Catalog client: Fix a bug that caused waiting for tasks to never complete
- The minimum `numpy` version has been bumped to 1.17.14 for Python versions > 3.5, which addresses a bug with `scenes.display`.
- `.compute()` is noticeably faster.
- Most of the Python string API is now available on `workflows.Str`.
- Interactive map: more descriptive error when not logged in to iam.descarteslabs.com.
- Passing the wrong types into functions causes more descriptive and reliable errors.
- `RST_STREAM` errors when calling `.compute()` have been eliminated.
- `Image`/`ImageCollection.count()` is much faster.
- `.buffer()` on vector types now works correctly.
- Calling `.compute()` on a `GeometryCollection` works.
- Catalog client: Added a `MaskBand.is_alpha` attribute to declare alpha channel behavior for a band.
- The maximum number of `extra_properties` allowed for Catalog objects has been increased from 10 to 50.
- Fixed bug causing `SceneCollection.download` to fail.
- When you call `.compute()` on an `Image` or `ImageCollection`, the `GeoContext` is included on the result object (`ImageResult.geocontext`, `ImageCollectionResult.geocontext`).
- Passing a Workflows `Timedelta` object (instead of a `datetime.timedelta`) into functions expecting it now behaves correctly.
- Arguments to the reducer function for `reduce` are now in the correct order.
- A new catalog client in `descarteslabs.catalog` makes searching and managing products, bands and images easier. This client encompasses functionality previously split between the `descarteslabs.Metadata` and `descarteslabs.Catalog` clients, which are now deprecated. Learn how to use the new API in the Catalog guide.
- Property filtering expressions such as used in `scenes.search()` and `FeatureCollection.filter()` now support an `in_()` method.
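A minimal sketch of the `in_()` expression, assuming the `properties` expression builder exported by the client; the product id and field name are placeholders:

```python
import descarteslabs as dl
from descarteslabs import properties as p
from descarteslabs.vectors import FeatureCollection

# Placeholders: substitute a real vector product id and a real property name.
fc = FeatureCollection("my-org:my-vector-product")
matches = fc.filter(properties=p.state.in_(["NM", "CO", "AZ"]))

for feature in matches.features():
    print(feature.properties)
```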
- `SceneCollection.download` previously always returned successfully even if one or more of the downloads failed. Now if any of the downloads fail, a RuntimeError is raised, which will detail which destination files failed and why.
- Fixed a bug where geometries used with the Scenes client had coordinates with reduced precision.
- Interactive parameters: add parameters to map layers and interactively control them using widgets
- Spatial convolution with `wf.conv2d`.
- Result containers have helpful `repr`s when displayed.
- `Datetime` and `Timedelta` are unpacked into `datetime.datetime` and `datetime.timedelta` objects when computed.
- [breaking] Result containers moved to `descarteslabs/workflows/results` and renamed, appending "Result" to disambiguate (e.g. ImageResult and ImageCollectionResult).
- [breaking] `.bands` and `.images` attributes of ImageResult and ImageCollectionResult renamed `.ndarray`.
- [breaking] When `compute`-ing an `Image` or `ImageCollection`, the order of `bandinfo` is only correct for Python >= 3.6.
- Interactive maps: coordinates are displayed in lat, lon order instead of lon, lat for easier copy-pasting.
- Interactive maps: each layer now has an associated output that is populated when running autoscale and deleted when the layer is removed.
- Interactive maps: `Image.visualize` returns a `Layer` object, making it easier to adjust `Layer.parameters` or integrate with other widgets.
- Composing operations onto imported Workflows no longer causes nondeterministic errors when computed.
- Interactive maps: `remove_layer` doesn't cause an error.
- No more errors when creating a `wf.parameter` for `Datetime` and other complex types.
- `.where` no longer causes a backend error.
- Calling `wf.map.geocontext()` when the map is not fully initialized raises an informative error.
- Operations on numbers computed from raster data (like `img_collection.mean(axis=None)`) no longer fail when computed.
- Colormap succeeds when the Image contains only 1 value.
- `Raster.stack` `max_workers` is limited to 25 workers, and will raise a warning and set the value to 25 if a value more than 25 is specified.
- Interactive maps: `clear_layers` and `remove_layer` methods.
- ImageCollections: `reversed` operator.
- ImageCollections: `concat` and `sorted` methods.
- ImageCollections: `head`, `tail`, and `partition` methods for slicing.
- ImageCollections: `where` method for filtering by condition.
- ImageCollections: `map_window` method for applying sliding windows.
- ImageCollections: indexing into ImageCollections is supported (`imgs[1]`).
- [breaking] Statistics functions are now applied to named axes.
- DateTime, Timedelta, Geocontext, Bool, and Geometry are now computable.
- ImageCollectionGroupby ProxyObject for grouping ImageCollection by properties, and applying functions over groups.
- ImageCollections: `groupby` method.
- `parameter` constructor.
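A hedged sketch of the named-axis statistics and `groupby` features above, assuming a grouping key function and a `(group, collection)` map callback; the product id is a placeholder:

```python
import descarteslabs.workflows as wf

# Placeholder: substitute a real product id.
ic = wf.ImageCollection.from_id(
    "some-product", start_datetime="2019-01-01", end_datetime="2020-01-01"
)

# Statistics functions now take named axes:
per_band_means = ic.mean(axis="images")  # one Image: mean across the collection
overall_mean = ic.mean(axis=None)        # one number: mean over everything

# Group the collection by month of acquisition, then composite each group:
monthly = ic.groupby(lambda img: img.properties["date"].month)
composites = monthly.map(lambda month, imgs: imgs.median(axis="images"))
```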
- Interactive maps: autoscaling is now done in the background.
- Tile requests can now include parameters.
- `median` is noticeably faster.
- `count` no longer breaks colormaps.
- `map`, `filter`, and `reduce` are 2x faster in the "PREPARING" stage.
- Significantly better performance for functions that reference variables outside their scope, like:

```python
overall_comp = ndvi.mean(axis="images")
deltas = ndvi.map(lambda img: img - overall_comp)
```
- Full support for floor-division (`//`) between Datetimes and Timedeltas (e.g. `imgs.filter(lambda img: img.properties['date'] // wf.Timedelta(days=14))`).
- [breaking] `ImageCollection.one` removed (in favor of indexing).
- `scenes.DLTile.assign(pad=...)` method added to ease creation of a tile identical in all ways except for the padding.
- The parameter `nbits` has been deprecated for catalog bands.
- New interactive map, with GUI controls for multiple layers, scaling, and colormaps.
- Colormaps for single-band images.
- Map interface displays errors that occur while the backend is rendering images.
- ImageCollection compositing no longer changes band names (`red` does not become `red_mean`, for example).
- `.clip()` and `.scale()` methods for Image/ImageCollection.
- Support specifying raster resampler method.
- Support specifying raster processing level: `toa` (top-of-atmosphere) or `surface` (surface reflectance).
- No more tiles 400s for missing data; missing/masked pixels can optionally be filled with a checkerboard pattern.
- Workflows `Image.concat` renamed `Image.concat_bands`.
- Data are left in `data_range` values if `physical_range` is not set, instead of scaling to the range `0..1`.
- Selecting the same band name twice (`img.pick_bands("vv vv")`) properly raises an error.
- Reduced `DeprecationWarning`s in Python 3.7.
- Alpha Workflows API client has been added. Access to the Workflows backend is restricted; contact support for more information.
- Workflows support for Python 3 added in channel v0-5.
- Scenes API now supports band scaling and output type specification for rastering methods.
- Methods in the Metadata, Raster, and Vector service clients that accepted GeoJSON geometries now also accept Shapely geometries.
- Add support for user cython modules in tasks.
- Tasks webhook methods no longer require a `group_id` if a webhook id is provided.
- `catalog_id` property on images is no longer supported by the API.
- Fix `scenes.display` handling of single band masked arrays with scalar masks.
- Fix problems with incomplete `UploadTask` instances returned by `vectors.FeatureCollection.list_uploads`.
- Metadata, Catalog, and Scenes now support a new `storage_state` property for managing image metadata and filtering search results. `storage_state="available"` is the default for new images and indicates that the raster data for that scene is available on the Descartes Labs Platform. `storage_state="remote"` indicates that the raster data has not yet been processed and made available to client users.
- The following additional colormaps are now supported for bands: 'cool', 'coolwarm', 'hot', 'bwr', 'gist_earth', 'terrain'. Find more details about the colormaps here.
- `Scene.ndarray`, `SceneCollection.stack`, and `SceneCollection.mosaic` now support passing a string as the `mask_alpha` argument to allow users to specify an alternate band name to use for masking.
- Scenes now supports a new `save_image` function that allows a user to save a visualization given a filename and extension.
- Tasks now allows you to unambiguously get a function by group id using `get_function_by_id`.
- All Client APIs now accept a `retries` argument to override the default retry configuration. The default remains the same as the prior behavior, which is to attempt 3 retries on errors which can be retried.
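A minimal sketch of the `retries` override; the retry count is illustrative and any client constructor should accept the argument per the entry above:

```python
import descarteslabs as dl

# Override the default of 3 retry attempts for this client instance.
metadata = dl.Metadata(retries=5)
products = metadata.products(limit=10)
```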
- Bands of different but compatible types can now be rastered together in `Scene.ndarray()` and `Scene.download()` as well as across multiple scenes in `SceneCollection.mosaic()`, `SceneCollection.stack()` and `SceneCollection.download()`. The result will have the most general data type.
- Vector client functions that accept a `geometry` argument now support passing Shapely shapes in addition to GeoJSON.
- Removed deprecated method `Metadata.sources()`.
- `FeatureCollection.filter(geometry)` will now raise an `InvalidQueryException` if you try to overwrite an existing geometry in the filter chain. You can only set the geometry once.
- Many old and obsolete examples were removed from the package.
- `Scene.ndarray`, `SceneCollection.stack`, and `SceneCollection.mosaic` now will automatically mask alpha if the alpha band is available in the relevant scene(s), and will set `mask_alpha` to `False` if the alpha band does not exist.
- `FeatureCollection.add`, `FeatureCollection.upload`, `Vector.create_feature`, `Vector.create_features`, and `Vector.upload_features` all accept a `fix_geometry` string argument that determines how to handle certain problem geometries including those which do not follow counter-clockwise winding order (which is required by the GeoJSON spec but not many popular tools). Allowed values are `reject` (reject invalid geometries with an error), `fix` (correct invalid geometries if possible and use this corrected value when creating the feature), and `accept` (the default) which will correct the geometry for internal use but retain the original geometry in the results.
- `Vector.get_upload_results` and `Vector.get_upload_result` now accept a `pending` parameter to include pending uploads in the results. Such pending results will have `status: PENDING` and, in lieu of a task id, the `id` attribute will contain the upload id as returned by `Vector.upload_features`.
- `UploadTask.status` no longer blocks until the upload task is completed, but rather returns the current status of the upload job, which may be `PENDING`, `RUNNING`, `SUCCESS`, or `FAILURE`.
- The `FutureTask.ready` and `UploadTask.ready` property has been added to test whether the task has completed. A return value of `True` means that if `get_result(wait=True)` were to be called, it would return without blocking.
- You can now export features to a storage `data` blob. To export from the `vector` client, use `Vector.export_product_from_query()` with a storage key and an optional query. This returns the task id of the export task. You can ask for status using `Vector.get_export_results()` for all export tasks or `Vector.get_export_result()` for a specific task by task id.
- FeatureCollection has been extended with this functionality with a `FeatureCollection.export()` method that takes a storage key. This operates on the filter chain that FeatureCollection represents, or the full product if there is no filter chain. It returns an `ExportTask` which behaves similar to the `FutureTask`.
- `Catalog.upload_image()` and `Catalog.upload_ndarray()` now will return an `upload_id` that can be used to query the status of that upload using `Catalog.upload_result()`. Note that the upload id is the image id and if you use identical image ids `Catalog.upload_result()` will only show the result of the most recent upload.
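A minimal sketch of the `fix_geometry` and non-blocking `status` behavior above; the product id and file path are placeholders:

```python
from descarteslabs.vectors import FeatureCollection

# Placeholders: substitute a real vector product id and an upload file.
fc = FeatureCollection("my-org:my-vector-product")
task = fc.upload("features.ndjson", fix_geometry="fix")  # correct bad winding order

print(task.status)           # returns immediately: PENDING, RUNNING, SUCCESS, or FAILURE
task.get_result(wait=True)   # block until the upload completes
```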
- Several typical kinds of non-conforming GeoJSON which previously caused errors can now be accepted or fixed by the `FeatureCollection` and `Vector` methods for adding or uploading new vector geometries.
- Fixed issues with `Catalog.upload_ndarray()` under Windows.
- Added header to client requests to better debug retries.
- Improved error messages for Catalog client upload methods
- Tasks methods `create_function`, `create_or_get_function`, and `new_group` now have image as a required parameter.
- The `name` parameter is renamed to `product_id` in `Vector.create_product`, and `FeatureCollection.create` and `FeatureCollection.copy`. The `name` parameter is renamed to `new_product_id` in `Vector.create_product_from_query`. Using `name` will continue to work, but will be removed completely in future versions.
- The `name` parameter is no longer required, and is ignored for `Vector.replace_product`, `Vector.update_product`, `FeatureCollection.update` and `FeatureCollection.replace`. This parameter will be removed completely in future versions.
- `Metadata.paged_search` has been added and essentially supports the original behavior of `Metadata.search` prior to release 0.16.0. This method should generally be avoided in favor of `Metadata.features` (or `Metadata.search`).
- Fixed typo in `UploadTask.status` which caused an exception when handling certain failure conditions.
- `FeatureCollection.upload` parameter `max_errors` was not being passed to the Vector client.
- Ensure `cloudpickle==0.4.0` is the version used when creating `Tasks`.
- Eliminate redundant queries from `FeatureCollection.list`.
- `FeatureCollection.upload` and `Vector.upload_features` now accept an optional `max_errors` parameter to control how many errors are acceptable before declaring an upload a failure.
- `UploadTask` (as returned by `FeatureCollection.upload` and `Vector.list_uploads`) now has added attributes to better identify what was processed and what errors occurred.
- `Storage` now has added methods `set_file` and `get_file` to allow for better uploading and downloading, respectively, of large files.
- `Storage` class now has an `exists()` method that checks whether an object exists in storage at the location of a given `key` and returns a boolean.
- `Scenes.search` allows `limit=None`.
- `FeatureCollection.delete_features` added to support deleting `Feature`s that match a `filter`.
- `FeatureCollection.delete_features` and `FeatureCollection.wait_for_copy` now use `AsyncJob` to poll for asynchronous job completion.
- `Vector.delete_features_from_query` and `Vector.get_delete_features_status` added to support new `FeatureCollection` and `AsyncJob` methods.
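A minimal sketch of the new `Storage` file helpers, assuming `set_file`/`get_file` accept local file paths; the key and paths are placeholders:

```python
from descarteslabs.client.services.storage import Storage

storage = Storage()
storage.set_file("models/my-model.bin", "local-model.bin")  # upload a large file

if storage.exists("models/my-model.bin"):
    storage.get_file("models/my-model.bin", "downloaded-model.bin")  # download it back
```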
- Fixed tasks bugs when including modules with relative paths in `sys.path`.
- Tasks now support passing modules, data and requirements along with the function code, allowing for a more complex and customized execution environment.
- Vector search query results now report their total number of results by means of the standard `len()` function.
- `Metadata.search` no longer has a 10,000-item limit, and the number of items returned will be closer to `limit`. This method no longer accepts the `continuation_token` parameter.
- Raster client can now handle arbitrarily large numbers of tiles generated from a shape using the new `iter_dltiles_from_shape()` method, which allows you to iterate over large numbers of tiles in a time- and memory-efficient manner (see the sketch after this list). Similarly the existing `dltiles_from_shape()` method can now handle arbitrarily large numbers of tiles although it can be very slow.
- Vector client `upload_features()` can now upload contents of a stream (e.g. `io.IOBase` derivative such as `io.StringIO`) as well as the contents of a named file.
- Vector FeatureCollection `add()` method can now handle an arbitrary number of Features. Use of the `upload_features()` method is still encouraged for large collections.
- Vector client now supports creating a new product from the results of a query against an existing product with the `create_product_from_query()` method. This support is also accessible via the new `FeatureCollection.copy()` method.
- XYZTile GeoContext class, helpful for rendering to web maps that use XYZ-style tiles in a spherical Mercator CRS.
- Tasks client FutureTask now instantiates a client if none provided (the default).
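A minimal sketch of lazy tile iteration with `iter_dltiles_from_shape()`; the polygon is an illustrative placeholder:

```python
import descarteslabs as dl

# Placeholder AOI: a small polygon in New Mexico.
shape = {
    "type": "Polygon",
    "coordinates": [[[-105.0, 35.0], [-104.0, 35.0], [-104.0, 36.0],
                     [-105.0, 36.0], [-105.0, 35.0]]],
}
raster_client = dl.Raster()

# Iterate lazily (resolution, tilesize, pad, shape) instead of
# materializing every tile up front.
for tile in raster_client.iter_dltiles_from_shape(10.0, 512, 0, shape):
    print(tile["properties"]["key"])
```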
- Catalog client methods now properly handle the `add_namespace` parameter.
- Vector Feature now includes valid geojson type 'Feature'.
- Tasks client now raises new GroupTerminalException if a task group stops accepting tasks.
- General documentation fixes.
- Scenes and raster clients have a `processing_level` parameter that can be used to turn on surface reflectance processing for products that support it.
- `scenes.GeoContext`: better defaults and `bounds_crs` parameter.
  - `bounds` are no longer limited to WGS84, but can be expressed in any `bounds_crs`.
  - New `Scene.default_ctx` uses a Scene's `geotrans` to more accurately determine a `GeoContext` that will result in no warping of the original data, better handling sinusoidal and other non-rectilinear coordinate reference systems.
  - Important: the default GeoContexts will now return differently-sized rasters than before! They will now be more accurate to the original, unwarped data, but if you were relying on the old defaults, you should now explicitly set the `bounds` to `geometry.bounds`, `bounds_crs` to `"EPSG:4326"`, and `align_pixels` to `True`.
- `Scene.coverage` and `SceneCollection.filter_coverage` accept any geometry-like object, not just a `GeoContext`.
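A minimal sketch of reproducing the old default-GeoContext behavior described above; the scene id is a placeholder:

```python
import descarteslabs as dl

# Placeholder: substitute a real scene id.
scene, default_ctx = dl.scenes.Scene.from_id("some-product:some-image-id")

# default_ctx now derives from the scene's native geotrans (no warping).
# To reproduce the old WGS84-aligned behavior explicitly:
old_style_ctx = default_ctx.assign(
    bounds=scene.geometry.bounds,
    bounds_crs="EPSG:4326",
    align_pixels=True,
)
```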
- `FutureTask` inheritance changed from `dict` to `object`.
- Can now specify a GPU parameter for tasks.
- `Vectors.upload` allows you to upload a JSON newline-delimited file.
- `Vectors.list_uploads` allows you to list all uploads for a vector product.
- `UploadTask` contains the information about an upload and is returned by both methods.
- `Vector.list_products` and `Vector.search_features` get `query_limit` and `page_size` parameters.
- `Vector.upload_features` handles new response format.
- Vector client support for retrieving status information about upload jobs. Added methods `Vector.get_upload_results` and `Vector.get_upload_result`.
- Shapely is now a full requirement of this package. Note: Windows users should visit https://docs.descarteslabs.com/installation.html#windows-users for installation guidance.
- Reduced the number of retries for some failure types.
- Resolved intermittent `SceneCollection.stack` bug that manifested as `AttributeError: 'NoneType' object has no attribute 'coords'` due to Shapely thread-unsafety.
- Tracking system environment to improve installation and support of different systems.
- The vector service is now part of the public package. See `descarteslabs.vectors` and `descarteslabs.client.services.vector`.
- Fixed SSL problems when copying clients to forked processes or sharing them among threads
- Removed extra keyword arguments from places client
- Added deprecation warnings for parameters that have been renamed in the Metadata client
- Scenes now exposes more parameters from raster and metadata
- Scenes `descarteslabs.scenes.search` will take a Python datetime object in addition to a string.
- Scenes will now allow Feature and FeatureCollection in addition to GeoJSON geometry types.
- Fixed Scenes issue preventing access to products with multi-byte data but single-byte alpha bands
- `Scene.download`, `SceneCollection.download`, and `SceneCollection.download_mosaic` methods.
- Colormaps supported in `descarteslabs.scenes.display`.
- Task namespaces are automatically created with the first task group.
- Moved metadata property filtering to common
- Deprecated `create_or_get_function` in tasks.
- Renamed some examples.
- Namespaced auth environment variables: `DESCARTESLABS_CLIENT_SECRET` and `DESCARTESLABS_CLIENT_ID`. `CLIENT_SECRET` and `CLIENT_ID` will continue to work.
- Tasks runtime check for Python version.
- Documentation updates
- Example updates
- Scenes package
- More examples
- Deprecated `add_namespace` argument in catalog client (defaults to `False` now, formerly `True`).
- Added org to token scope
- Removed deprecated key usage
- Tasks service
- Patched bug in catalog service for py3
- Catalog service
- Storage service
- Switched to `start_datetime` argument pattern instead of `start_date`.
- Fixed minor regression with `descarteslabs.ext` clients.
- Deprecated token param for `Service` class.
- Raster stack method
- Removed deprecated searching by `const_id`.
- Removed deprecated raster band methods.
- Deprecated `sat_id` parameter for metadata searches.
- Changed documentation from readthedocs to https://docs.descarteslabs.com.
- Dot notation access to dictionaries returned by services
- Reorganization into a client submodule
- Fix regression for `NotFoundError`.
- Reverted `descarteslabs.services.base` to `descarteslabs.services.service`.
- Reorganization of services
- Places updated to v2 backend, provides units interface to statistics, which carries some backwards incompatibility.
- Blosc Support for raster array compression transport
- Scrolling support for large metadata searches
- Offset keyword argument in metadata.search has been deprecated. Please use the metadata.features for iterating over large search results
- Complex filtering expressions for image attributes
- Raise explicitly on 409 response
- Keep retrying token refresh until token fully expired
- Fixed race condition when creating `.descarteslabs` directory.
- Added ext namespace
- Metadata multi-get
- Fix OpenSSL install on OSX
- Automatic retry on 504
- Internal API refactoring / improvements for Auth
- Add raster bands methods to metadata service.
- Deprecate raster band methods.
- Add `require_bands` param to derived bands search method.
- Test suite replaces original token when finished running script tests.
- Support for derived bands endpoints.
- Direct access to `const_id` to `product` translation.
- Fix `descarteslabs` scripts on Windows OS.
- Fix auth login
- Add metadata.bands and metadata.products search/get capabilities.
- Add bands/products descriptions
- Additional Placetypes
- Better error messages with timeouts
- Update to latest version of `requests`.
- Major refactor of metadata.search:
  - Introduction of "Products" through `Metadata.products()`.
  - Metadata entry ids now concatenate the product id and the old metadata keys. The original metadata keys are available through `entry['key']`.
  - Additional sorting available.
- Search & Raster using DLTile Feature GeoJSON or key. Uses output bounds, resolution, and srs to ease searching and rasterizing imagery over tiles.
- Better Error messaging
- DLTile notebook
- `save` and `outfile_basename` in `Raster.raster()`.
- Fix metadata.features
- Strict "requests" versions needed due to upstream instability.
- Fix python 3 command line compatibility
- API Change: `descarteslabs`, `raster`, and `metadata` have all been merged into 'descarteslabs'. 'descarteslabs login' is now 'descarteslabs auth login', 'raster' is now 'descarteslabs raster', etc.
- A Changelog
- Testing around command-line scripts
- Searching with cloud_fraction = 0
- dltile API documentation
- Fix login bug
- Installation of "requests[security]" for python < 2.7.9
- Doctests
- Python 3 login bug
- Search by Fractions
- Initial release of client library