Commit dd48d05

[FLINK-24018][build] Remove Scala dependencies from Java APIs
1 parent 055c8c8 commit dd48d05

153 files changed, +426 -439 lines changed
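
For users of the Java APIs, the practical effect of this commit is that the affected artifact IDs lose their Scala-version suffix. A minimal before/after sketch of a Maven dependency, using `flink-clients` (one of the artifacts renamed below) and the version left as a placeholder:

```xml
<!-- Before this commit: Java API artifacts carried a Scala suffix -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_2.12</artifactId>
    <version><!-- current flink version --></version>
</dependency>

<!-- After this commit: the Scala suffix is gone -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients</artifactId>
    <version><!-- current flink version --></version>
</dependency>
```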

docs/README.md (+3 -3)

@@ -103,14 +103,14 @@ to its documentation markdown. The following are available for use:
 
 #### Flink Artifact
 
-{{< artifact flink-streaming-java withScalaVersion >}}
+{{< artifact flink-streaming-scala withScalaVersion >}}
 
-This will be replaced by the maven artifact for flink-streaming-java that users should copy into their pom.xml file. It will render out to:
+This will be replaced by the maven artifact for flink-streaming-scala that users should copy into their pom.xml file. It will render out to:
 
 ```xml
 <dependency>
 <groupId>org.apache.flink</groupId>
-<artifactId>flink-streaming-java_2.12</artifactId>
+<artifactId>flink-streaming-scala_2.12</artifactId>
 <version><!-- current flink version --></version>
 </dependency>
 ```
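
Note the asymmetry after this change: the hunk above keeps `withScalaVersion` because the README example now points at the Scala API artifact, while the Java artifacts touched elsewhere in this commit drop the flag. Without `withScalaVersion`, the shortcode should render a suffix-free coordinate; a sketch of the expected output (the actual rendering is produced by the docs' Hugo shortcode, which is not part of this diff):

```xml
<!-- Expected rendering of {{< artifact flink-streaming-java >}} (no withScalaVersion flag) -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java</artifactId>
    <version><!-- current flink version --></version>
</dependency>
```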

docs/content.zh/docs/connectors/datastream/elasticsearch.md (+3 -3)

@@ -42,15 +42,15 @@ under the License.
 <tbody>
 <tr>
 <td>5.x</td>
-<td>{{< artifact flink-connector-elasticsearch5 withScalaVersion >}}</td>
+<td>{{< artifact flink-connector-elasticsearch5 >}}</td>
 </tr>
 <tr>
 <td>6.x</td>
-<td>{{< artifact flink-connector-elasticsearch6 withScalaVersion >}}</td>
+<td>{{< artifact flink-connector-elasticsearch6 >}}</td>
 </tr>
 <tr>
 <td>7 及更高版本</td>
-<td>{{< artifact flink-connector-elasticsearch7 withScalaVersion >}}</td>
+<td>{{< artifact flink-connector-elasticsearch7 >}}</td>
 </tr>
 </tbody>
 </table>

docs/content.zh/docs/connectors/datastream/jdbc.md (+1 -1)

@@ -30,7 +30,7 @@ under the License.
 
 添加下面的依赖以便使用该连接器(同时添加 JDBC 驱动):
 
-{{< artifact flink-connector-jdbc withScalaVersion >}}
+{{< artifact flink-connector-jdbc >}}
 
 注意该连接器目前还 __不是__ 二进制发行版的一部分,如何在集群中运行请参考 [这里]({{< ref "docs/dev/datastream/project-configuration" >}})。

docs/content.zh/docs/connectors/datastream/kafka.md (+1 -1)

@@ -36,7 +36,7 @@ Apache Flink 集成了通用的 Kafka 连接器,它会尽力与 Kafka client
 当前 Kafka client 向后兼容 0.10.0 或更高版本的 Kafka broker。
 有关 Kafka 兼容性的更多细节,请参考 [Kafka 官方文档](https://kafka.apache.org/protocol.html#protocol_compatibility)
 
-{{< artifact flink-connector-kafka withScalaVersion >}}
+{{< artifact flink-connector-kafka >}}
 
 如果使用 Kafka source,```flink-connector-base``` 也需要包含在依赖中:
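
The hunk's last context line notes that a Kafka source additionally needs `flink-connector-base`. A sketch of the resulting, now Scala-free Maven entries, with versions left as placeholders:

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka</artifactId>
    <version><!-- current flink version --></version>
</dependency>
<!-- additionally required when using the Kafka source -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-base</artifactId>
    <version><!-- current flink version --></version>
</dependency>
```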

docs/content.zh/docs/connectors/datastream/kinesis.md (+3 -3)

@@ -31,14 +31,14 @@ The Kinesis connector provides access to [Amazon AWS Kinesis Streams](http://aws
 
 To use the connector, add the following Maven dependency to your project:
 
-{{< artifact flink-connector-kinesis withScalaVersion >}}
+{{< artifact flink-connector-kinesis >}}
 
 {{< hint warning >}}
-**Attention** Prior to Flink version 1.10.0 the `flink-connector-kinesis{{< scala_version >}}` has a dependency on code licensed under the [Amazon Software License](https://aws.amazon.com/asl/).
+**Attention** Prior to Flink version 1.10.0 the `flink-connector-kinesis` has a dependency on code licensed under the [Amazon Software License](https://aws.amazon.com/asl/).
 Linking to the prior versions of flink-connector-kinesis will include this code into your application.
 {{< /hint >}}
 
-Due to the licensing issue, the `flink-connector-kinesis{{< scala_version >}}` artifact is not deployed to Maven central for the prior versions. Please see the version specific documentation for further information.
+Due to the licensing issue, the `flink-connector-kinesis` artifact is not deployed to Maven central for the prior versions. Please see the version specific documentation for further information.
 
 ## Using the Amazon Kinesis Streams Service
 Follow the instructions from the [Amazon Kinesis Streams Developer Guide](https://docs.aws.amazon.com/streams/latest/dev/learning-kinesis-module-one-create-stream.html)

docs/content.zh/docs/connectors/datastream/nifi.md (+1 -1)

@@ -29,7 +29,7 @@ under the License.
 [Apache NiFi](https://nifi.apache.org/) 连接器提供了可以读取和写入的 Source 和 Sink。
 使用这个连接器,需要在工程中添加下面的依赖:
 
-{{< artifact flink-connector-nifi withScalaVersion >}}
+{{< artifact flink-connector-nifi >}}
 
 注意这些连接器目前还没有包含在二进制发行版中。添加依赖、打包配置以及集群运行的相关信息请参考 [这里]({{< ref "docs/dev/datastream/project-configuration" >}})。

docs/content.zh/docs/connectors/datastream/pubsub.md (+1 -1)

@@ -28,7 +28,7 @@ under the License.
 
 这个连接器可向 [Google Cloud PubSub](https://cloud.google.com/pubsub) 读取与写入数据。添加下面的依赖来使用此连接器:
 
-{{< artifact flink-connector-pubsub withScalaVersion >}}
+{{< artifact flink-connector-pubsub >}}
 
 <p style="border-radius: 5px; padding: 5px" class="bg-danger">
 <b>注意</b>:此连接器最近才加到 Flink 里,还未接受广泛测试。

docs/content.zh/docs/connectors/datastream/pulsar.md (+1 -1)

@@ -33,7 +33,7 @@ Flink 当前只提供 [Apache Pulsar](https://pulsar.apache.org) 数据源,用
 
 如果想要了解更多关于 Pulsar API 兼容性设计,可以阅读文档 [PIP-72](https://github.com/apache/pulsar/wiki/PIP-72%3A-Introduce-Pulsar-Interface-Taxonomy%3A-Audience-and-Stability-Classification)
 
-{{< artifact flink-connector-pulsar withScalaVersion >}}
+{{< artifact flink-connector-pulsar >}}
 
 Flink 的流连接器并不会放到发行文件里面一同发布,阅读[此文档]({{< ref "docs/dev/datastream/project-configuration" >}}),了解如何将连接器添加到集群实例内。

docs/content.zh/docs/connectors/datastream/rabbitmq.md (+1 -1)

@@ -38,7 +38,7 @@ Flink 自身既没有复用 "RabbitMQ AMQP Java Client" 的代码,也没有将
 
 这个连接器可以访问 [RabbitMQ](http://www.rabbitmq.com/) 的数据流。使用这个连接器,需要在工程里添加下面的依赖:
 
-{{< artifact flink-connector-rabbitmq withScalaVersion >}}
+{{< artifact flink-connector-rabbitmq >}}
 
 注意连接器现在没有包含在二进制发行版中。集群执行的相关信息请参考 [这里]({{< ref "docs/dev/datastream/project-configuration" >}}).

docs/content.zh/docs/connectors/datastream/twitter.md (+1 -1)

@@ -30,7 +30,7 @@ under the License.
 Flink Streaming 通过一个内置的 `TwitterSource` 类来创建到 tweets 流的连接。
 使用 Twitter 连接器,需要在工程中添加下面的依赖:
 
-{{< artifact flink-connector-twitter withScalaVersion >}}
+{{< artifact flink-connector-twitter >}}
 
 注意:当前的二进制发行版还没有这些连接器。集群执行请参考[这里]({{< ref "docs/dev/datastream/project-configuration" >}}).

docs/content.zh/docs/dev/dataset/cluster_execution.md (+1 -1)

@@ -53,7 +53,7 @@ Flink 程序可以分布式运行在多机器集群上。有两种方式可以
 ```xml
 <dependency>
 <groupId>org.apache.flink</groupId>
-<artifactId>flink-clients{{< scala_version >}}</artifactId>
+<artifactId>flink-clients</artifactId>
 <version>{{< version >}}</version>
 </dependency>
 ```

docs/content.zh/docs/dev/dataset/local_execution.md (+1 -1)

@@ -46,7 +46,7 @@ If you are developing your program in a Maven project, you have to add the `flin
 ```xml
 <dependency>
 <groupId>org.apache.flink</groupId>
-<artifactId>flink-clients{{< scala_version >}}</artifactId>
+<artifactId>flink-clients</artifactId>
 <version>{{< version >}}</version>
 </dependency>
 ```

docs/content.zh/docs/dev/datastream/fault-tolerance/queryable_state.md (+1 -1)

@@ -55,7 +55,7 @@ under the License.
 
 为了在 Flink 集群上使用 queryable state,需要进行以下操作:
 
-1. 将 `flink-queryable-state-runtime{{< scala_version >}}-{{< version >}}.jar`
+1. 将 `flink-queryable-state-runtime-{{< version >}}.jar`
 从 [Flink distribution]({{< downloads >}} "Apache Flink: Downloads") 的 `opt/` 目录拷贝到 `lib/` 目录;
 2. 将参数 `queryable-state.enable` 设置为 `true`。详细信息以及其它配置可参考文档 [Configuration]({{< ref "docs/deployment/config" >}}#queryable-state)。

docs/content.zh/docs/dev/datastream/project-configuration.md (+4 -4)

@@ -81,7 +81,7 @@ When setting up a project manually, you need to add the following dependencies f
 ```xml
 <dependency>
 <groupId>org.apache.flink</groupId>
-<artifactId>flink-streaming-java{{< scala_version >}}</artifactId>
+<artifactId>flink-streaming-java</artifactId>
 <version>{{< version >}}</version>
 <scope>provided</scope>
 </dependency>

@@ -124,7 +124,7 @@ Below is an example adding the connector for Kafka as a dependency (Maven syntax
 ```xml
 <dependency>
 <groupId>org.apache.flink</groupId>
-<artifactId>flink-connector-kafka{{< scala_version >}}</artifactId>
+<artifactId>flink-connector-kafka</artifactId>
 <version>{{< version >}}</version>
 </dependency>
 ```

@@ -373,13 +373,13 @@ dependencies {
 // Compile-time dependencies that should NOT be part of the
 // shadow jar and are provided in the lib folder of Flink
 // --------------------------------------------------------------
-compile "org.apache.flink:flink-streaming-java_${scalaBinaryVersion}:${flinkVersion}"
+compile "org.apache.flink:flink-streaming-java:${flinkVersion}"
 
 // --------------------------------------------------------------
 // Dependencies that should be part of the shadow jar, e.g.
 // connectors. These must be in the flinkShadowJar configuration!
 // --------------------------------------------------------------
-//flinkShadowJar "org.apache.flink:flink-connector-kafka_${scalaBinaryVersion}:${flinkVersion}"
+//flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
 
 compile "org.apache.logging.log4j:log4j-api:${log4jVersion}"
 compile "org.apache.logging.log4j:log4j-core:${log4jVersion}"

docs/content.zh/docs/dev/datastream/testing.md (+3 -3)

@@ -153,9 +153,9 @@ class IncrementFlatMapFunctionTest extends FlatSpec with MockFactory {
 
 要使用测试工具,还需要一组其他的依赖项(测试范围)。
 
-{{< artifact flink-test-utils withScalaVersion withTestScope >}}
+{{< artifact flink-test-utils withTestScope >}}
 {{< artifact flink-runtime withTestScope >}}
-{{< artifact flink-streaming-java withScalaVersion withTestScope withTestClassifier >}}
+{{< artifact flink-streaming-java withTestScope withTestClassifier >}}
 
 现在,可以使用测试工具将记录和 watermark 推送到用户自定义函数或自定义算子中,控制处理时间,最后对算子的输出(包括旁路输出)进行校验。

@@ -401,7 +401,7 @@ Apache Flink 提供了一个名为 `MiniClusterWithClientResource` 的 Junit 规
 
 要使用 `MiniClusterWithClientResource`,需要添加一个额外的依赖项(测试范围)。
 
-{{< artifact flink-test-utils withScalaVersion withTestScope >}}
+{{< artifact flink-test-utils withTestScope >}}
 
 让我们采用与前面几节相同的简单 `MapFunction`来做示例。
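
Rendered out, the test-scoped shortcodes above should now yield plain Java coordinates. A sketch, assuming `withTestScope` maps to `<scope>test</scope>` and `withTestClassifier` to `<classifier>tests</classifier>` (that mapping is implemented by the docs' shortcode, which is not shown in this diff):

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-test-utils</artifactId>
    <version><!-- current flink version --></version>
    <scope>test</scope> <!-- assumed rendering of withTestScope -->
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java</artifactId>
    <version><!-- current flink version --></version>
    <scope>test</scope>
    <classifier>tests</classifier> <!-- assumed rendering of withTestClassifier -->
</dependency>
```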

docs/content.zh/docs/dev/table/sql/queries/match_recognize.md (+1 -1)

@@ -79,7 +79,7 @@ Flink 的 `MATCH_RECOGNIZE` 子句实现是一个完整标准子集。仅支持
 ```xml
 <dependency>
 <groupId>org.apache.flink</groupId>
-<artifactId>flink-cep{{< scala_version >}}</artifactId>
+<artifactId>flink-cep</artifactId>
 <version>{{< version >}}</version>
 </dependency>
 ```

docs/content.zh/docs/libs/cep.md (+1 -1)

@@ -43,7 +43,7 @@ FlinkCEP是在Flink上层实现的复杂事件处理库。
 
 {{< tabs "722d55a5-7f12-4bcc-b080-b28d5e8860ac" >}}
 {{< tab "Java" >}}
-{{< artifact flink-cep withScalaVersion >}}
+{{< artifact flink-cep >}}
 {{< /tab >}}
 {{< tab "Scala" >}}
 {{< artifact flink-cep-scala withScalaVersion >}}
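
After this change the two tabs diverge as intended: only the Scala API artifact keeps the Scala suffix. A sketch of what the two shortcodes should render to, with the `_2.12` suffix taken from the README example earlier in this commit:

```xml
<!-- Java tab: {{< artifact flink-cep >}} -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-cep</artifactId>
    <version><!-- current flink version --></version>
</dependency>

<!-- Scala tab: {{< artifact flink-cep-scala withScalaVersion >}} -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-cep-scala_2.12</artifactId>
    <version><!-- current flink version --></version>
</dependency>
```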

docs/content.zh/docs/libs/gelly/overview.md (+1 -1)

@@ -45,7 +45,7 @@ Add the following dependency to your `pom.xml` to use Gelly.
 
 {{< tabs "96de5128-3c66-4942-9498-e9a8ae439314" >}}
 {{< tab "Java" >}}
-{{< artifact flink-gelly withScalaVersion >}}
+{{< artifact flink-gelly >}}
 {{< /tab >}}
 {{< tab "Scala" >}}
 {{< artifact flink-gelly-scala withScalaVersion >}}

docs/content.zh/docs/libs/state_processor_api.md (+1 -1)

@@ -37,7 +37,7 @@ For example, you can now arbitrarily modify the data types of states, adjust the
 
 To get started with the state processor api, include the following library in your application.
 
-{{< artifact flink-state-processor-api withScalaVersion >}}
+{{< artifact flink-state-processor-api >}}
 
 ## Mapping Application State to DataSets

docs/content.zh/docs/ops/state/state_backends.md

+1-1
Original file line numberDiff line numberDiff line change
@@ -128,7 +128,7 @@ env.setStateBackend(new HashMapStateBackend())
128128
```xml
129129
<dependency>
130130
<groupId>org.apache.flink</groupId>
131-
<artifactId>flink-statebackend-rocksdb{{< scala_version >}}</artifactId>
131+
<artifactId>flink-statebackend-rocksdb</artifactId>
132132
<version>{{< version >}}</version>
133133
<scope>provided</scope>
134134
</dependency>

docs/content/docs/connectors/datastream/elasticsearch.md (+3 -3)

@@ -44,15 +44,15 @@ of the Elasticsearch installation:
 <tbody>
 <tr>
 <td>5.x</td>
-<td>{{< artifact flink-connector-elasticsearch5 withScalaVersion >}}</td>
+<td>{{< artifact flink-connector-elasticsearch5 >}}</td>
 </tr>
 <tr>
 <td>6.x</td>
-<td>{{< artifact flink-connector-elasticsearch6 withScalaVersion >}}</td>
+<td>{{< artifact flink-connector-elasticsearch6 >}}</td>
 </tr>
 <tr>
 <td>7 and later versions</td>
-<td>{{< artifact flink-connector-elasticsearch7 withScalaVersion >}}</td>
+<td>{{< artifact flink-connector-elasticsearch7 >}}</td>
 </tr>
 </tbody>
 </table>
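
Taking one row of the table as an example, the Elasticsearch 7 shortcode should now render to a suffix-free dependency; a sketch, mirroring the rendering shown in the README hunk at the top of this commit:

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-elasticsearch7</artifactId>
    <version><!-- current flink version --></version>
</dependency>
```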

docs/content/docs/connectors/datastream/jdbc.md (+1 -1)

@@ -30,7 +30,7 @@ This connector provides a sink that writes data to a JDBC database.
 
 To use it, add the following dependency to your project (along with your JDBC driver):
 
-{{< artifact flink-connector-jdbc withScalaVersion >}}
+{{< artifact flink-connector-jdbc >}}
 
 Note that the streaming connectors are currently __NOT__ part of the binary distribution. See how to link with them for cluster execution [here]({{< ref "docs/dev/datastream/project-configuration" >}}).
 A driver dependency is also required to connect to a specified database. Please consult your database documentation on how to add the corresponding driver.

docs/content/docs/connectors/datastream/kafka.md (+1 -1)

@@ -36,7 +36,7 @@ The version of the client it uses may change between Flink releases.
 Modern Kafka clients are backwards compatible with broker versions 0.10.0 or later.
 For details on Kafka compatibility, please refer to the official [Kafka documentation](https://kafka.apache.org/protocol.html#protocol_compatibility).
 
-{{< artifact flink-connector-kafka withScalaVersion >}}
+{{< artifact flink-connector-kafka >}}
 
 Flink's streaming connectors are not currently part of the binary distribution.
 See how to link with them for cluster execution [here]({{< ref "docs/dev/datastream/project-configuration" >}}).

docs/content/docs/connectors/datastream/kinesis.md (+3 -3)

@@ -31,14 +31,14 @@ The Kinesis connector provides access to [Amazon AWS Kinesis Streams](http://aws
 
 To use the connector, add the following Maven dependency to your project:
 
-{{< artifact flink-connector-kinesis withScalaVersion >}}
+{{< artifact flink-connector-kinesis >}}
 
 {{< hint warning >}}
-**Attention** Prior to Flink version 1.10.0 the `flink-connector-kinesis{{< scala_version >}}` has a dependency on code licensed under the [Amazon Software License](https://aws.amazon.com/asl/).
+**Attention** Prior to Flink version 1.10.0 the `flink-connector-kinesis` has a dependency on code licensed under the [Amazon Software License](https://aws.amazon.com/asl/).
 Linking to the prior versions of flink-connector-kinesis will include this code into your application.
 {{< /hint >}}
 
-Due to the licensing issue, the `flink-connector-kinesis{{< scala_version >}}` artifact is not deployed to Maven central for the prior versions. Please see the version specific documentation for further information.
+Due to the licensing issue, the `flink-connector-kinesis` artifact is not deployed to Maven central for the prior versions. Please see the version specific documentation for further information.
 
 ## Using the Amazon Kinesis Streams Service
 Follow the instructions from the [Amazon Kinesis Streams Developer Guide](https://docs.aws.amazon.com/streams/latest/dev/learning-kinesis-module-one-create-stream.html)

docs/content/docs/connectors/datastream/nifi.md (+1 -1)

@@ -30,7 +30,7 @@ This connector provides a Source and Sink that can read from and write to
 [Apache NiFi](https://nifi.apache.org/). To use this connector, add the
 following dependency to your project:
 
-{{< artifact flink-connector-nifi withScalaVersion >}}
+{{< artifact flink-connector-nifi >}}
 
 Note that the streaming connectors are currently not part of the binary
 distribution. See

docs/content/docs/connectors/datastream/pubsub.md (+1 -1)

@@ -30,7 +30,7 @@ This connector provides a Source and Sink that can read from and write to
 [Google Cloud PubSub](https://cloud.google.com/pubsub). To use this connector, add the
 following dependency to your project:
 
-{{< artifact flink-connector-gcp-pubsub withScalaVersion >}}
+{{< artifact flink-connector-gcp-pubsub >}}
 
 {{< hint warning >}}
 <b>Note</b>: This connector has been added to Flink recently. It has not received widespread testing yet.

docs/content/docs/connectors/datastream/pulsar.md (+1 -1)

@@ -33,7 +33,7 @@ Pulsar [transactions](https://pulsar.apache.org/docs/en/txn-what/),
 it is recommended to use Pulsar 2.8.0 or higher releases.
 For details on Pulsar compatibility, please refer to the [PIP-72](https://github.com/apache/pulsar/wiki/PIP-72%3A-Introduce-Pulsar-Interface-Taxonomy%3A-Audience-and-Stability-Classification).
 
-{{< artifact flink-connector-pulsar withScalaVersion >}}
+{{< artifact flink-connector-pulsar >}}
 
 Flink's streaming connectors are not currently part of the binary distribution.
 See how to link with them for cluster execution [here]({{< ref "docs/dev/datastream/project-configuration" >}}).

docs/content/docs/connectors/datastream/rabbitmq.md (+1 -1)

@@ -42,7 +42,7 @@ must be aware that this may be subject to conditions declared in the Mozilla Pub
 
 This connector provides access to data streams from [RabbitMQ](http://www.rabbitmq.com/). To use this connector, add the following dependency to your project:
 
-{{< artifact flink-connector-rabbitmq withScalaVersion >}}
+{{< artifact flink-connector-rabbitmq >}}
 
 Note that the streaming connectors are currently not part of the binary distribution. See linking with them for cluster execution [here]({{< ref "docs/dev/datastream/project-configuration" >}}).

docs/content/docs/connectors/datastream/twitter.md

+1-1
Original file line numberDiff line numberDiff line change
@@ -30,7 +30,7 @@ The [Twitter Streaming API](https://dev.twitter.com/docs/streaming-apis) provide
3030
Flink Streaming comes with a built-in `TwitterSource` class for establishing a connection to this stream.
3131
To use this connector, add the following dependency to your project:
3232

33-
{{< artifact flink-connector-twitter withScalaVersion >}}
33+
{{< artifact flink-connector-twitter >}}
3434

3535
Note that the streaming connectors are currently not part of the binary distribution.
3636
See linking with them for cluster execution [here]({{< ref "docs/dev/datastream/project-configuration" >}}).

docs/content/docs/dev/dataset/cluster_execution.md (+1 -1)

@@ -51,7 +51,7 @@ If you are developing your program as a Maven project, you have to add the
 ```xml
 <dependency>
 <groupId>org.apache.flink</groupId>
-<artifactId>flink-clients{{< scala_version >}}</artifactId>
+<artifactId>flink-clients</artifactId>
 <version>{{< version >}}</version>
 </dependency>
 ```

docs/content/docs/dev/dataset/local_execution.md (+1 -1)

@@ -46,7 +46,7 @@ If you are developing your program in a Maven project, you have to add the `flin
 ```xml
 <dependency>
 <groupId>org.apache.flink</groupId>
-<artifactId>flink-clients{{< scala_version >}}</artifactId>
+<artifactId>flink-clients</artifactId>
 <version>{{< version >}}</version>
 </dependency>
 ```

docs/content/docs/dev/datastream/fault-tolerance/queryable_state.md (+1 -1)

@@ -71,7 +71,7 @@ response back to the client.
 
 To enable queryable state on your Flink cluster, you need to do the following:
 
-1. copy the `flink-queryable-state-runtime{{< scala_version >}}-{{< version >}}.jar`
+1. copy the `flink-queryable-state-runtime-{{< version >}}.jar`
 from the `opt/` folder of your [Flink distribution]({{< downloads >}} "Apache Flink: Downloads"),
 to the `lib/` folder.
 2. set the property `queryable-state.enable` to `true`. See the [Configuration]({{< ref "docs/deployment/config" >}}#queryable-state) documentation for details and additional parameters.
