Releases: openlayer-ai/openlayer-python
v0.0.0a6
Pin versions of requests, urllib3
v0.0.0a5
Fixes OPEN-3628: Conda not available in subprocess calls when running in a Docker container
v0.0.0a4
Bump version
v0.0.0a3
Fix aws storage type
v0.0.0a2
Update StorageType enum
v0.0.0a01
Shifted back to PyPI because setuptools is not available on Test PyPI
v0.3.0
Added
- A `Project` helper class.
- A convenience method `create_or_load_project`, which loads a project if it has already been created.
- Accepts `AZURE` as a `DeploymentType`.
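The get-or-create behavior behind `create_or_load_project` can be sketched with a hypothetical in-memory client. All names here (`FakeClient`, the registry, the `TaskType` values) are illustrative stand-ins, not the library's actual API:

```python
# Minimal sketch of the get-or-create pattern behind create_or_load_project.
# The client class, its storage, and the TaskType values are hypothetical,
# not the openlayer package's real implementation.
from enum import Enum


class TaskType(Enum):
    TABULAR_CLASSIFICATION = "tabular-classification"
    TEXT_CLASSIFICATION = "text-classification"


class FakeClient:
    def __init__(self):
        self._projects = {}  # project name -> project dict

    def create_project(self, name, task_type):
        if name in self._projects:
            raise ValueError(f"project {name!r} already exists")
        self._projects[name] = {"name": name, "task_type": task_type}
        return self._projects[name]

    def load_project(self, name):
        return self._projects[name]

    def create_or_load_project(self, name, task_type):
        # Load the project if it already exists; otherwise create it.
        try:
            return self.load_project(name)
        except KeyError:
            return self.create_project(name, task_type)


client = FakeClient()
p1 = client.create_or_load_project("churn", TaskType.TABULAR_CLASSIFICATION)
p2 = client.create_or_load_project("churn", TaskType.TABULAR_CLASSIFICATION)
assert p1 is p2  # the second call loads the existing project
```

The pattern makes repeated notebook runs idempotent: re-executing the cell returns the existing project instead of raising a duplicate-name error.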
Changed
- Compatibility with Unbox API OpenAPI refactor.
- Models and datasets must be added to projects.
- Deprecates `categorical_features_map` in favor of `categorical_feature_names` for model and dataset uploads.
- Moved the `TaskType` attribute from the `Model` level to the `Project` level. Creating a `Project` now requires specifying the `TaskType`.
- Removed `name` from `add_dataset`.
- Changed `description` to `commit_message` in `add_dataset`, `add_dataframe`, and `add_model`.
- `requirements_txt_file` is no longer optional for model uploads.
- The NLP dataset character limit is now 1000 characters.
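Migrating off the deprecated argument can be sketched as follows, assuming the old `categorical_features_map` mapped each categorical feature name to its allowed values while the new `categorical_feature_names` takes only the names (the helper below is hypothetical, not part of the library):

```python
# Hypothetical migration helper: derive the new-style name list from an
# old-style map. The assumption (old arg maps feature name -> allowed
# values) is illustrative; check your own call sites before migrating.
def to_categorical_feature_names(categorical_features_map):
    """Return the feature names from an old-style categorical features map."""
    return sorted(categorical_features_map)


old_style = {"gender": ["male", "female"], "plan": ["free", "pro"]}
print(to_categorical_feature_names(old_style))  # -> ['gender', 'plan']
```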
Fixed
- More comprehensive model and dataset upload validation.
- Fixed a bug that caused duplicate feature names for NLP datasets when uploading the same dataset twice.
- Added `protobuf==3.2.0` to requirements to fix a bug with model deployment.
0.3.0rc1
Added
- A
- A `Project` helper class.
- A convenience method `create_or_load_project`, which loads a project if it has already been created.
- Accepts `AZURE` as a `DeploymentType`.
Changed
- Compatibility with Unbox API OpenAPI refactor.
- Models and datasets must be added to projects.
- Deprecates `categorical_features_map` in favor of `categorical_feature_names` for model and dataset uploads.
- Moved the `TaskType` attribute from the `Model` level to the `Project` level. Creating a `Project` now requires specifying the `TaskType`.
- Removed `name` from `add_dataset`.
- Changed `description` to `commit_message` in `add_dataset`, `add_dataframe`, and `add_model`.
- `requirements_txt_file` is no longer optional for model uploads.
- The NLP dataset character limit is now 1000 characters.
Fixed
- More comprehensive model and dataset upload validation.
- Fixed a bug that caused duplicate feature names for NLP datasets when uploading the same dataset twice.
- Added `protobuf==3.2.0` to requirements to fix a bug with model deployment.
0.3.0a5
Changed
- NLP dataset character limit changed to 1000 characters.
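The 1000-character limit can be checked client-side before uploading; a minimal sketch, assuming text rows are plain strings (the helper name and return convention are hypothetical, not the library's validation API):

```python
# Hedged sketch: mirror the 1000-character limit on NLP dataset text rows
# locally. The limit comes from these release notes; the helper itself is
# illustrative, not part of the openlayer package.
NLP_CHAR_LIMIT = 1000


def rows_over_limit(rows):
    """Return the indices of text rows that exceed the character limit."""
    return [i for i, text in enumerate(rows) if len(text) > NLP_CHAR_LIMIT]


rows = ["short review", "x" * 1001]
print(rows_over_limit(rows))  # -> [1]
```

Running a check like this locally surfaces offending rows before an upload is rejected.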
Fixed
- Fixed an issue with duplicate feature names for NLP datasets.
v0.3.0a4
Fixed
- `explainability_tokenizer` validation.