Ticket expiration policies (CAS 5.x)

This article walks through the five Ticket Granting Ticket (TGT) expiration policies in CAS 5.x: HardTimeout, NeverExpiresExpirationPolicy, ThrottledUseAndTimeoutExpirationPolicy, TicketGrantingTicketExpirationPolicy, and TimeoutExpirationPolicy. Each policy differs in its configuration and in what triggers expiration, such as a hard deadline, no expiration at all, or activity-based checks.


TGT expiration policies:

1. HardTimeout

The policy class is org.jasig.cas.ticket.support.HardTimeoutExpirationPolicy. Its 4.x Spring bean configuration:

<bean id="grantingTicketExpirationPolicy"
      class="org.jasig.cas.ticket.support.HardTimeoutExpirationPolicy">
    <constructor-arg index="0" value="7200000" /> <!-- in milliseconds -->
</bean>

The corresponding 5.x application configuration:

cas:
  ticket:
    tgt:
      hardTimeout:
        timeToKillInSeconds: 7200

The TGT expires exactly two hours after it is created, regardless of any activity in between; there is no extension, which is why the class name contains "hard". Note that the 4.x constructor argument is in milliseconds (7200000 ms), while the 5.x property is in seconds (7200 s); both equal two hours.
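The hard-timeout behavior can be sketched as a plain check against the creation time. This is a minimal illustration; the class and method names are invented for the example and are not the actual CAS source:

```java
// Minimal sketch of a hard-timeout expiration check.
// Illustrative only; not the CAS implementation.
public class HardTimeoutSketch {

    /**
     * @param creationTimeMillis when the TGT was created
     * @param timeToKillMillis   hard lifetime, e.g. 7200000 (2 hours)
     * @param nowMillis          current time
     * @return true once the hard deadline has passed, regardless of use
     */
    static boolean isExpired(long creationTimeMillis, long timeToKillMillis, long nowMillis) {
        return nowMillis - creationTimeMillis >= timeToKillMillis;
    }

    public static void main(String[] args) {
        long created = 0L;
        long ttl = 7_200_000L; // 2 hours in milliseconds
        System.out.println(isExpired(created, ttl, 3_600_000L)); // 1 hour in: false
        System.out.println(isExpired(created, ttl, 7_200_000L)); // exactly 2 hours: true
    }
}
```

The key property of this policy is that the check only involves the creation time, never the last-use time, so activity cannot keep the ticket alive.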

2. NeverExpiresExpirationPolicy
The policy class is org.jasig.cas.ticket.support.NeverExpiresExpirationPolicy. Its 4.x Spring bean configuration:

<bean id="grantingTicketExpirationPolicy"
      class="org.jasig.cas.ticket.support.NeverExpiresExpirationPolicy">
</bean>
As the name suggests, a TGT governed by this policy never expires on its own; it stays valid until it is explicitly destroyed, for example by a logout or an administrative removal. CAS 5.x has no dedicated property block for this policy; a common way to get the same effect is to set the TGT timeouts to a negative value (an assumption; verify it against the property reference for your CAS version):

cas:
  ticket:
    tgt:
      timeToKillInSeconds: -1
      maxTimeToLiveInSeconds: -1

3. ThrottledUseAndTimeoutExpirationPolicy
The policy class is org.jasig.cas.ticket.support.ThrottledUseAndTimeoutExpirationPolicy. It combines an idle timeout with use throttling: the TGT expires if it goes unused for longer than the configured time-to-kill, and attempts to use it more frequently than the configured minimum interval between uses are likewise treated as expired.
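A configuration sketch for this policy, following the pattern of the earlier sections. The property names are assumed from the cas.ticket.tgt.* layout used above and the values are illustrative; verify both against the CAS 5.x property reference:

```yaml
cas:
  ticket:
    tgt:
      throttledTimeout:
        timeToKillInSeconds: 7200       # expire after 2 hours without use
        timeInBetweenUsesInSeconds: 5   # uses closer together than 5 s are rejected
```

Unlike HardTimeout, each legitimate use resets the idle clock, so an active session can stay alive indefinitely as long as uses are spaced out.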
