[docs] document OPTIONS settings (#1285)
jschneier authored Sep 4, 2023
1 parent 19a15c2 commit ad36a9f
Showing 8 changed files with 546 additions and 744 deletions.
462 changes: 151 additions & 311 deletions docs/backends/amazon-S3.rst

Large diffs are not rendered by default.

224 changes: 104 additions & 120 deletions docs/backends/azure.rst
@@ -4,188 +4,172 @@ Azure Storage
A custom storage system for Django using Microsoft Azure Storage backend.


Installation
------------

Install Azure SDK::

    pip install django-storages[azure]

Configuration & Settings
------------------------

Django 4.2 changed the way file storage objects are configured. In particular, it made it easier to independently
configure storage backends and add additional ones. Configuring multiple storage objects before Django 4.2 required
subclassing the backend because the settings were global; now you pass them under the key ``OPTIONS``. For example,
to save media files to Azure on Django >= 4.2 you'd define::

    STORAGES = {
        "default": {
            "BACKEND": "storages.backends.azure_storage.AzureStorage",
            "OPTIONS": {
                ...your_options_here
            },
        },
    }

On Django < 4.2 you'd instead define::

    DEFAULT_FILE_STORAGE = "storages.backends.azure_storage.AzureStorage"

To put static files on Azure via ``collectstatic`` on Django >= 4.2 you'd include the ``staticfiles`` key (at the
same level as ``default`` above) inside of the ``STORAGES`` dictionary, as sketched below, while on Django < 4.2
you'd instead define::

    STATICFILES_STORAGE = "storages.backends.azure_storage.AzureStorage"
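
A combined Django >= 4.2 configuration might look like the following sketch; the container names are placeholders
rather than required values::

    STORAGES = {
        "default": {
            "BACKEND": "storages.backends.azure_storage.AzureStorage",
            "OPTIONS": {
                # hypothetical container for media uploads
                "azure_container": "media",
            },
        },
        "staticfiles": {
            "BACKEND": "storages.backends.azure_storage.AzureStorage",
            "OPTIONS": {
                # hypothetical container for collectstatic output
                "azure_container": "static",
            },
        },
    }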

The settings documented in the following sections include both the key for ``OPTIONS`` (and subclassing) as
well as the global value. Given the significant improvements provided by the new API, migration is strongly encouraged.

Authentication Settings
~~~~~~~~~~~~~~~~~~~~~~~

Several different methods of authentication are provided. In order of precedence they are (a configuration sketch
follows the list):

#. ``connection_string`` or ``AZURE_CONNECTION_STRING`` (see `Connection string docs <http://azure.microsoft.com/en-us/documentation/articles/storage-configure-connection-string/>`_)
#. (``account_key`` or ``AZURE_ACCOUNT_KEY``) and (``account_name`` or ``AZURE_ACCOUNT_NAME``)
#. ``token_credential`` or ``AZURE_TOKEN_CREDENTIAL``
#. ``sas_token`` or ``AZURE_SAS_TOKEN``
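
For example, a minimal sketch that authenticates with an account name and key through ``OPTIONS`` (all values shown
are placeholders)::

    STORAGES = {
        "default": {
            "BACKEND": "storages.backends.azure_storage.AzureStorage",
            "OPTIONS": {
                # placeholder credentials -- substitute your own storage account values
                "account_name": "mystorageaccount",
                "account_key": "supersecretkey",
                "azure_container": "media",
            },
        },
    }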

Settings
~~~~~~~~
``azure_container`` or ``AZURE_CONTAINER``

**Required**
This is where the files uploaded through Django will be uploaded.
The container must be already created, since the storage system will not attempt to create it.
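
The container can be created ahead of time in the Azure portal or with the ``azure-storage-blob`` SDK that the
``azure`` extra installs. A minimal sketch, assuming you have a connection string at hand::

    from azure.storage.blob import BlobServiceClient

    # placeholder connection string -- use your storage account's value
    service = BlobServiceClient.from_connection_string("<connection-string>")
    service.create_container("media")  # raises ResourceExistsError if it already exists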

``azure_ssl`` or ``AZURE_SSL``

Default: ``True``

Set a secure connection (HTTPS), otherwise it makes an insecure connection (HTTP).

``upload_max_conn`` or ``AZURE_UPLOAD_MAX_CONN``

Default: ``2``

Number of connections to make when uploading a single file.

``timeout`` or ``AZURE_CONNECTION_TIMEOUT_SECS``

Default: ``20``

Global connection timeout in seconds.

``max_memory_size`` or ``AZURE_BLOB_MAX_MEMORY_SIZE``

Default: ``2*1024*1024`` i.e. ``2MB``

Maximum memory, in bytes, used by a downloaded file before it is dumped to disk.

``expiration_secs`` or ``AZURE_URL_EXPIRATION_SECS``

Default: ``None``

Seconds before a URL expires; set to ``None`` to never expire it.
Be aware the container must have public read permissions in order
to access a URL without an expiration date.

``overwrite_files`` or ``AZURE_OVERWRITE_FILES``

Default: ``False``

Whether or not to overwrite a file previously uploaded with the same name. If not, random characters are appended.

``location`` or ``AZURE_LOCATION``

Default: ``''``

Default location for the uploaded files. This is a path that gets prepended to every file name.

``endpoint_suffix`` or ``AZURE_ENDPOINT_SUFFIX``

Default: ``core.windows.net``

Use ``core.chinacloudapi.cn`` for azure.cn accounts.

``custom_domain`` or ``AZURE_CUSTOM_DOMAIN``

Default: ``None``

The custom domain to use for generating URLs for files. For
example, ``www.mydomain.com`` or ``mycdn.azureedge.net``.

``AZURE_TOKEN_CREDENTIAL``

A token credential used to authenticate HTTPS requests. The token value
should be updated before its expiration.

``cache_control`` or ``AZURE_CACHE_CONTROL``

Default: ``None``

A variable to set the Cache-Control HTTP response header. E.g.::

    cache_control: "public,max-age=31536000,immutable"

``object_parameters`` or ``AZURE_OBJECT_PARAMETERS``

Default: ``{}``

Use this to set content settings on all objects. To set these on a per-object
basis, subclass the backend and override ``AzureStorage.get_object_parameters``; a sketch follows below.

This is a Python ``dict`` and the possible parameters are: ``content_type``, ``content_encoding``, ``content_language``, ``content_disposition``, ``cache_control``, and ``content_md5``.
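
As a rough sketch, a subclass could adjust the parameters per object; this assumes ``get_object_parameters``
receives the file name and returns the ``dict`` described above, and the module path and logic are illustrative::

    # file: myproject/storage_backends.py (hypothetical module)
    from storages.backends.azure_storage import AzureStorage

    class CustomAzureStorage(AzureStorage):
        def get_object_parameters(self, name):
            # start from the globally configured object_parameters
            params = super().get_object_parameters(name)
            # then tweak them per object, e.g. by file extension
            if name.lower().endswith(".svg"):
                params["content_type"] = "image/svg+xml"
            return params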

``api_version`` or ``AZURE_API_VERSION``

Default: ``None``

The API version to use.

Additional Notes
----------------

Filename Restrictions
~~~~~~~~~~~~~~~~~~~~~

Azure file names have some extra restrictions. They can't:

- end with a dot (``.``) or slash (``/``)
- contain more than 256 slashes (``/``)
- be longer than 1024 characters

Private vs Public URLs
~~~~~~~~~~~~~~~~~~~~~~

The difference between public and private URLs is that private URLs include the SAS token.
With private URLs you can override certain properties stored for the blob by specifying
query parameters as part of the shared access signature. These properties include the
cache-control, content-type, content-encoding, content-language, and content-disposition.
See https://docs.microsoft.com/en-us/rest/api/storageservices/set-blob-properties#remarks

You can specify these parameters by::

    az_storage = AzureStorage()
    az_url = az_storage.url(blob_name, parameters={'content_type': 'text/html;'})
6 changes: 3 additions & 3 deletions docs/backends/backblaze-B2.rst
@@ -8,6 +8,6 @@
#. Create an `application key <https://www.backblaze.com/b2/docs/application_keys.html>`_. Best practice is to limit access to the bucket you just created.
#. Follow the instructions in the :doc:`Amazon S3 docs <amazon-S3>` with the following exceptions:

* Set ``region_name`` to your Backblaze B2 region, for example, ``us-west-004``
* Set ``endpoint_url`` to ``https://s3.${region_name}.backblazeb2.com``
* Set the values of ``access_key`` and ``secret_key`` to the application key id and application key you created in step 2 (a configuration sketch follows).
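
Putting those together, a Backblaze B2 configuration on Django >= 4.2 might look like the following sketch
(the bucket name, region, and credentials are placeholders; ``bucket_name`` is the S3 backend's bucket setting)::

    STORAGES = {
        "default": {
            "BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
            "OPTIONS": {
                # hypothetical values -- use your own bucket, region, and application key
                "bucket_name": "my-b2-bucket",
                "region_name": "us-west-004",
                "endpoint_url": "https://s3.us-west-004.backblazeb2.com",
                "access_key": "<application-key-id>",
                "secret_key": "<application-key>",
            },
        },
    }
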
6 changes: 3 additions & 3 deletions docs/backends/digital-ocean-spaces.rst
@@ -3,6 +3,6 @@ Digital Ocean

Digital Ocean Spaces implements the S3 protocol. To use it, follow the instructions in the :doc:`Amazon S3 docs <amazon-S3>` with the important caveats that you must:

- Set ``region_name`` to your Digital Ocean region (such as ``nyc3`` or ``sfo2``)
- Set ``endpoint_url`` to the value of ``https://${region_name}.digitaloceanspaces.com``
- Set the values of ``access_key`` and ``secret_key`` to the corresponding values from Digital Ocean (a configuration sketch follows)
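
For example, a Digital Ocean Spaces configuration on Django >= 4.2 could look like the following sketch
(the Space name, region, and credentials are placeholders)::

    STORAGES = {
        "default": {
            "BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
            "OPTIONS": {
                # hypothetical values -- substitute your Space name, region, and keys
                "bucket_name": "my-space",
                "region_name": "nyc3",
                "endpoint_url": "https://nyc3.digitaloceanspaces.com",
                "access_key": "<spaces-access-key>",
                "secret_key": "<spaces-secret-key>",
            },
        },
    }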