
Data plane codegen testing


There are two distinct kinds of evolution for a data-plane generated SDK:

  • Developer driven evolution
  • Service driven evolution

The main difference between the two is the status of the Swagger. Developer driven evolution is not prompted by a change in the Swagger, but by reasons to believe the surface API should be improved to provide a better customer experience. Tests for these scenarios assume the Swagger is not changing, and there is only one Swagger version for the entire test suite.

Service driven evolution, on the other hand, means the service has a new API version and is adding behaviors to the service API. Assuming the service team follows the Azure guidelines, those changes should not be breaking (it is accepted that breaking changes at the REST API layer will break the generated SDK as a consequence). Service teams are in any case encouraged by the stewardship review board to avoid breaking changes in the REST API.

Changes made to a Swagger "in place" for a given API version do NOT fall into either of those categories; they are bug fixes of the Swagger and should happen only while the service is still in preview. Swagger adjustments (such as missing LRO annotations, adding a default client value, paging configuration, etc.) are only acceptable in preview releases, and the Swagger for a given API version is frozen once the SDK generated from it is declared GA. Verifying that we can adapt the SDK to those scenarios is therefore NOT a goal; we expect Swagger validation to be done by the service teams during the various previews, reviews, and SDK testing before GA.

While service teams will have requirements on the level of testing they must do to ship a generated data-plane SDK, we want to verify upstream that the codegen is designed in a way that enables the previous scenarios. To do that, the autorest.testserver will provide a set of Swagger scenarios that each language needs to implement to show it is prepared for them.

Scenarios

For the following scenarios, the codegen is not required to generate code that handles them automatically; the requirement is that the design is flexible enough for an SDK writer to handcraft the support in a "reasonable time". The hand crafting must be documented, and its size will determine whether we consider the scenario fully supported by the codegen or whether more investment is needed.

Developer driven evolution

Swagger input: https://github.com/Azure/autorest.testserver/blob/main/swagger/dpg-customization.json

Example of scenarios:

  • Improve a GET method that returns raw JSON to return a model (see the sketch after this list)
  • Improve a PUT polling method that returns raw JSON to return a model
  • Improve a GET paging method that returns raw JSON to return a model
  • Improve a POST method that reads raw JSON to accept a model
  • Breaking the glass / escape hatch: call an API that has no Swagger definition.

The full list can be found here: https://microsoft.sharepoint.com/:x:/t/AzureDeveloperExperience/EX8EMNJ1RyRIkqzrQUUbyoUBo6c_EW05tisOsdTtxxhqVw?e=CSYlhn
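
To make the first scenario concrete, here is a minimal Python sketch of the kind of hand crafting we expect. All names (get_resource, Resource, CustomizedClient) are illustrative and not taken from dpg-customization.json; the sketch only shows the shape of a hand-written convenience layer that wraps a generated operation returning raw JSON and exposes a model instead.

```python
from typing import Any, Dict


class Resource:
    """Hand-written model replacing the raw JSON dict returned by the generated call."""

    def __init__(self, *, received: str) -> None:
        self.received = received

    @classmethod
    def from_json(cls, data: Dict[str, Any]) -> "Resource":
        # "received" is an illustrative property name, not taken from the test Swagger.
        return cls(received=data.get("received", ""))


class CustomizedClient:
    """Thin hand-written convenience layer over the generated (hypothetical) client."""

    def __init__(self, generated_client: Any) -> None:
        self._generated = generated_client

    def get_resource(self) -> Resource:
        # The generated protocol method returns the deserialized JSON as-is;
        # the convenience method converts it into the hand-written model.
        raw: Dict[str, Any] = self._generated.get_resource()
        return Resource.from_json(raw)
```

In Python such a layer typically lives in a _patch.py file next to the generated code; other languages have their own customization mechanism. The point of the scenario is that this hand-crafted layer stays small enough to be written in a "reasonable time".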

Service driven evolution

It is recommended to run those scenarios by generating the SDK with the initial file, and then generating a second time with the update file in place (as if the second file had been added later as a new API version). It's important to do it this way, since we are testing what happens when an SDK is regenerated from a new Swagger.

The scenarios here are considered acceptable changes under the Azure breaking change policy.

Example of scenarios:

  • A method gets a new optional parameter (see the sketch after this list)
  • A new method is added (new path)
  • A new method is added (path existed in the Swagger)
  • A new body type is added (was JSON, and now JSON + JPEG).

The full list can be found here: https://microsoft.sharepoint.com/:x:/t/AzureDeveloperExperience/EX8EMNJ1RyRIkqzrQUUbyoUBo6c_EW05tisOsdTtxxhqVw?e=CSYlhn
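
As an illustration of the first scenario, the minimal Python sketch below (with hypothetical operation and parameter names) shows why a new optional, keyword-only parameter added on regeneration is not a breaking change for code written against the previous generation.

```python
from typing import Any, Dict, Optional


class GeneratedClientInitial:
    """Shape of an operation generated from the initial Swagger (hypothetical names)."""

    def get_required(self, parameter: str) -> Dict[str, Any]:
        # Stand-in for the generated HTTP call.
        return {"parameter": parameter}


class GeneratedClientUpdated:
    """Same operation after regenerating from the new API version."""

    def get_required(
        self, parameter: str, *, new_parameter: Optional[str] = None
    ) -> Dict[str, Any]:
        # The added parameter is optional and keyword-only, so existing call
        # sites keep working and behave the same when it is omitted.
        body: Dict[str, Any] = {"parameter": parameter}
        if new_parameter is not None:
            body["new_parameter"] = new_parameter
        return body


def existing_caller(client: Any) -> Dict[str, Any]:
    # Written against the initial generation; works unchanged with either client.
    return client.get_required("example")
```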
