Allow extensions for token parameters #1241

@arner

Problem statement

When developing new features, or to meet project-specific requirements, it is sometimes necessary to add new Validators. These validators decide on the validity of a token transaction based on use-case-specific business logic.

Such validators may need access to custom, relatively static information: for example, the presence of a signature from a role that is not in the Issuers or Auditors slices, a list of allowed token types for the namespace, or any other use-case-specific data. The most logical place to store that data is the token parameters. At the moment this is only possible by forking and changing the serialization and deserialization logic (and possibly tokengen), which is not easy to do and makes it hard to stay compatible with the latest open source code.

Goal of the feature

The goal is to make it easier to extend the Token SDK with custom business logic by making the token parameters extensible:

  1. Open source tokengen can be used, regardless of whether custom data is needed.
  2. Open source chaincode and validators remain forward and backward compatible with parameters that contain custom data.
  3. We keep the option of extending the structured protobuf in this repo (forward and backward compatible by design).

It remains the responsibility of the application to serialize and deserialize the extended data (and version it if necessary), and to ensure auditability of which code has been run to validate which transactions. That does not change and is out of scope for this feature. With this feature, at least the serialization of the parameters is no longer the bottleneck for compatibility.
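As an illustration only (the key, type, and helper names below are made up for this sketch, not part of the SDK), an application could wrap its custom data in a small versioned envelope before placing the bytes under its own key in the extensions:

```go
// Hypothetical application-side code: version the extension payload so the
// application can evolve it independently of the Token SDK.
package app

import (
	"encoding/json"
	"fmt"
)

// allowedTypesV1 is an example of use-case-specific data an application
// might want to carry in the public parameters.
type allowedTypesV1 struct {
	Version    int      `json:"version"`
	TokenTypes []string `json:"token_types"`
}

// MarshalExtension produces the bytes the application stores under its own
// key (e.g. "myapp.allowed-types") in the extensions map.
func MarshalExtension(types []string) ([]byte, error) {
	return json.Marshal(allowedTypesV1{Version: 1, TokenTypes: types})
}

// UnmarshalExtension is the counterpart used by the custom validator.
func UnmarshalExtension(raw []byte) (*allowedTypesV1, error) {
	var v allowedTypesV1
	if err := json.Unmarshal(raw, &v); err != nil {
		return nil, fmt.Errorf("unmarshal extension: %w", err)
	}
	if v.Version != 1 {
		return nil, fmt.Errorf("unsupported extension version %d", v.Version)
	}
	return &v, nil
}
```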

Implementation

  • token/core/zkatdlog/nogh/protos/noghpp.proto and the fabtoken equivalent get a field map<string, bytes> extensions.
  • pp.PublicParameters gets an equivalent Extensions map[string][]byte (see the sketch after this list).
  • tokengen generate can be run with --extra foo=foo.json --extra bar=mydata.bin, which serializes the key/value pairs (the value being the bytes of the file). The same should be implemented for tokengen update (exact logic to be determined).
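A minimal sketch of the Go side, assuming hypothetical type and method names (the actual structs and validator interfaces in the repo may differ):

```go
// Sketch only: the public parameters carry a free-form extensions map
// mirroring the proposed proto field `map<string, bytes> extensions`,
// and a custom validator looks up the key it cares about.
package pp

import "fmt"

// PublicParameters stands in for the SDK's public-parameters type,
// extended with the proposed extensions map.
type PublicParameters struct {
	// ... existing fields (issuers, auditors, curve data, ...)
	Extensions map[string][]byte
}

// Extension returns the raw bytes stored under key, if present. Open source
// chaincode and validators simply carry unknown keys along, which keeps them
// forward and backward compatible with parameters that contain custom data.
func (p *PublicParameters) Extension(key string) ([]byte, bool) {
	raw, ok := p.Extensions[key]
	return raw, ok
}

// ValidateWithExtension is an example custom check: it requires that the
// deployment provided data under its key and leaves interpretation of the
// bytes to the application (see the envelope sketch above).
func ValidateWithExtension(p *PublicParameters, key string) ([]byte, error) {
	raw, ok := p.Extension(key)
	if !ok {
		return nil, fmt.Errorf("public parameters are missing extension %q", key)
	}
	return raw, nil
}
```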

An alternative would be to require the extra data to be in a user-supplied proto format. That gives stronger typing, but at the cost of complexity. For now the free-form approach is a good balance; we can always add a structured extensions field if the need arises.
