
Error running grun Hello r -tree: Can't load Hello as lexer or parser

Following the tutorial "Getting Started with ANTLR v4", I got to the step of running:

grun Hello r -tree

which failed with:

Can’t load Hello as lexer or parser
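For context, the tutorial's Hello example is built roughly like this (a sketch; it assumes the antlr4 and grun aliases from the getting-started guide are already set up):

```shell
# generate HelloLexer.java / HelloParser.java from the tutorial's Hello.g4
antlr4 Hello.g4

# compile; HelloLexer.class and HelloParser.class end up in the current directory
javac Hello*.java

# test rule r and print the parse tree; this is the step that fails here
grun Hello r -tree
```

Note that the compiled classes land in the current working directory, which matters for what follows.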

[Troubleshooting]

1. Another machine with an identical setup does not show this error.

2. The failing machine was running Java 1.6 update 38, so I replaced it with the same Java 1.6.0 build as the working machine:

java version "1.6.0"

Java(TM) SE Runtime Environment (build 1.6.0-b105)

Java HotSpot(TM) 64-Bit Server VM (build 1.6.0-b105, mixed mode)

The problem remained.

[Summary]

At the time I could not work out the cause and suspected ANTLR 4 flakiness; the real cause (see the postscript) turned out to be much simpler.

[Postscript 2013-01-24]

It later turned out that when setting CLASSPATH I had accidentally dropped the leading dot ".", the entry that stands for the current directory:

;%JAVA_HOME%\jre\lib\rt.jar;D:\DevTool\DD_Parser\Parser\ANTLR\antlr\antlr-4.0-complete.jar;.

Adding the dot back, so that it reads:

.;%JAVA_HOME%\jre\lib\rt.jar;D:\DevTool\DD_Parser\Parser\ANTLR\antlr\antlr-4.0-complete.jar;

fixed the problem.
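For reference, a per-session way to set this in Windows cmd is sketched below (the jar path is the one used in this post; adjust it for your install). The key detail is the leading ".":

```shell
:: Windows cmd, current session only; note the leading "." (current directory)
set CLASSPATH=.;%JAVA_HOME%\jre\lib\rt.jar;D:\DevTool\DD_Parser\Parser\ANTLR\antlr\antlr-4.0-complete.jar

:: verify what the JVM will see
echo %CLASSPATH%
```

Setting it via the system environment-variables dialog works the same way, as long as the "." entry is present.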

In hindsight this makes sense: the leading "." tells the JVM to also search the current directory, which is where the generated classes live:

HelloParser.class

HelloLexer.class

With the dot in place, grun can load these classes and run normally.

Without it, the JVM cannot find the lexer or parser classes, hence the error.
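An equivalent way to see the dependency is to bypass the CLASSPATH variable and pass -cp explicitly; grun is just an alias for ANTLR's TestRig class (org.antlr.v4.runtime.misc.TestRig in the 4.0 release used here; later releases moved it to org.antlr.v4.gui.TestRig). A sketch, run from the directory containing the compiled classes:

```shell
:: works: "." puts the directory with HelloLexer.class/HelloParser.class on the classpath
java -cp .;D:\DevTool\DD_Parser\Parser\ANTLR\antlr\antlr-4.0-complete.jar org.antlr.v4.runtime.misc.TestRig Hello r -tree

:: fails with "Can't load Hello as lexer or parser": no "." entry, so the
:: generated classes in the current directory are invisible to the JVM
java -cp D:\DevTool\DD_Parser\Parser\ANTLR\antlr\antlr-4.0-complete.jar org.antlr.v4.runtime.misc.TestRig Hello r -tree
```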

