Commit 6438fce

Rename master -> main (triton-inference-server#2913)
* Rename master -> main
* Fix master -> main across the documentation
1 parent c8fbe2b commit 6438fce

4 files changed, 5 insertions(+), 5 deletions(-)

README.md (+1 -1)

@@ -30,7 +30,7 @@

 # Triton Inference Server

-**LATEST RELEASE: You are currently on the master branch which tracks
+**LATEST RELEASE: You are currently on the main branch which tracks
 under-development progress towards the next release. The latest
 release of the Triton Inference Server is 2.10.0 and is available on
 branch

docs/build.md (+2 -2)

@@ -69,7 +69,7 @@ Triton and so does not appear in /opt/tritonserver/backends).
 The first step for any build is to checkout the
 [triton-inference-server/server](https://github.com/triton-inference-server/server)
 repo branch for the release you are interested in building (or the
-master branch to build from the development branch). Then run build.py
+master/main branch to build from the development branch). Then run build.py
 as described below. The build.py script performs these steps when
 building with Docker.

@@ -129,7 +129,7 @@ without Docker.
 The first step for any build is to checkout the
 [triton-inference-server/server](https://github.com/triton-inference-server/server)
 repo branch for the release you are interested in building (or the
-master branch to build from the development branch). Then run build.py
+master/main branch to build from the development branch). Then run build.py
 as described below. The build.py script will perform the following
 steps (note that if you are building with Docker that these same steps
 will be performed during the Docker build within the
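
As context for the hunks above, the checkout step they describe can be sketched as follows. The release branch name `r21.05` is only an illustrative placeholder, and build.py's options are not listed because they depend on the backends and features being built; this is a sketch assuming build.py accepts the standard `--help` flag, not the authoritative procedure from docs/build.md.

```bash
# Check out the server repo on the branch to build from.
git clone https://github.com/triton-inference-server/server.git
cd server

# Either a release branch (the name below is a placeholder) ...
git checkout r21.05

# ... or the development branch, renamed from "master" to "main" by this commit.
git checkout main

# Then run build.py as described in docs/build.md; listing its options
# is a reasonable first step.
python build.py --help
```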

docs/inference_protocols.md (+1 -1)

@@ -39,7 +39,7 @@ protocols](https://github.com/kubeflow/kfserving/tree/master/docs/predict-api/v2
 that have been proposed by the [KFServing
 project](https://github.com/kubeflow/kfserving). To fully enable all
 capabilities Triton also implements a number [HTTP/REST and GRPC
-extensions](https://github.com/triton-inference-server/server/tree/master/docs/protocol).
+extensions](https://github.com/triton-inference-server/server/tree/main/docs/protocol).
 to the KFServing inference protocol.

 The HTTP/REST and GRPC protcols provide endpoints to check server and
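
For reference, the KFServing-style endpoints this hunk refers to can be exercised with curl. This is a minimal sketch assuming a Triton server is already running locally on the default HTTP port 8000; the full set of endpoints is defined by the KFServing inference protocol and the Triton protocol extensions linked above.

```bash
# Liveness and readiness checks defined by the KFServing v2 protocol.
curl -v localhost:8000/v2/health/live
curl -v localhost:8000/v2/health/ready

# Server metadata.
curl localhost:8000/v2
```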

docs/model_configuration.md (+1 -1)

@@ -221,7 +221,7 @@ configuration file.
 When using --strict-model-config=false you can see the model
 configuration that was generated for a model by using the [model
 configuration
-endpoint](https://github.com/triton-inference-server/server/blob/master/docs/protocol/extension_model_configuration.md). The
+endpoint](https://github.com/triton-inference-server/server/blob/main/docs/protocol/extension_model_configuration.md). The
 easiest way to do this is to use a utility like *curl*:

 ```bash
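
The bash block opened at the end of this hunk is truncated in the diff view. A minimal sketch of the kind of request the surrounding text describes, assuming a hypothetical model named `mymodel` and the default HTTP port 8000 (the actual example lives in docs/model_configuration.md), is:

```bash
# Fetch the generated model configuration for "mymodel" via the
# model configuration extension endpoint.
curl localhost:8000/v2/models/mymodel/config
```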
