Scala: Generating a Random Number Between min and max

This post shows a snippet that generates a random integer in the range 1 to 35. It uses `(new Random).nextInt`, which returns a random number greater than or equal to 0 and strictly less than the given upper bound.

The following code produces integers in the range 1 to 35:

    import scala.util.Random

    // Print random integers in the inclusive range 1 to 35.
    for (_ <- 1 to 10) {
      val a = 1 + (new Random).nextInt((35 - 1) + 1) // nextInt(35) returns 0..34
      println(a)
    }


`(new Random).nextInt(n)` returns an integer greater than or equal to 0 and strictly less than n, so `min + nextInt((max - min) + 1)` shifts that into the inclusive range [min, max].
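The same pattern generalizes to any inclusive range; a minimal sketch (the helper name `randomInRange` is my own, not from the original post):

```scala
import scala.util.Random

// Uniform random Int in the inclusive range [min, max].
// nextInt((max - min) + 1) yields 0..(max - min); adding min shifts it up.
def randomInRange(min: Int, max: Int): Int =
  min + Random.nextInt((max - min) + 1)

// The 1-to-35 case from above:
val x = randomInRange(1, 35)
println(x)
```

Note that when `min == max` the bound passed to `nextInt` is 1, so the helper deterministically returns `min`.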

The `flink-job/pom.xml` (child module):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <!-- Inherit from the parent project -->
    <parent>
        <groupId>com.xixi</groupId>
        <artifactId>dataBase</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <relativePath>../pom.xml</relativePath> <!-- relative path to the parent POM -->
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>flink-job</artifactId> <!-- child module artifactId -->
    <name>flink-job</name>
    <description>Flink Job Module</description>
    <!-- Module dependencies (versions inherited from the parent) -->
    <dependencies>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-base</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>com.alibaba.fastjson2</groupId>
            <artifactId>fastjson2</artifactId>
        </dependency>
        <dependency>
            <groupId>com.alibaba.fastjson2</groupId>
            <artifactId>fastjson2-extension</artifactId>
        </dependency>
        <dependency>
            <groupId>com.alibaba.fastjson2</groupId>
            <artifactId>fastjson2-extension-spring6</artifactId>
        </dependency>
        <dependency>
            <groupId>commons-io</groupId>
            <artifactId>commons-io</artifactId>
        </dependency>
        <dependency>
            <groupId>com.xixi</groupId>
            <artifactId>common</artifactId> <!-- matches the common module's artifactId -->
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <!-- Compiler plugin (inherits the parent's configuration) -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>17</source>
                    <target>17</target>
                </configuration>
            </plugin>
            <!-- Optional: Spring Boot plugin (for running the Flink job locally) -->
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
                <configuration>
                    <excludes>
                        <exclude>
                            <groupId>org.projectlombok</groupId>
                            <artifactId>lombok</artifactId>
                        </exclude>
                    </excludes>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
```

The parent `pom.xml`:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.5.0</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <packaging>pom</packaging>
    <modules>
        <!-- Child module names, matching the directory names (e.g. flink-job) -->
        <module>flink-job</module>
        <module>common</module>
        <!-- Any other child modules (e.g. canal, service) must also be declared here -->
    </modules>
    <groupId>com.xixi</groupId>
    <artifactId>dataBase</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>dataBase</name>
    <description>dataBase</description>
    <properties>
        <java.version>17</java.version>
        <flink.version>2.0.0</flink.version>
        <fastjson.version>2.0.49</fastjson.version>
    </properties>
    <dependencyManagement>
        <dependencies>
            <!-- Flink -->
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-connector-base</artifactId>
                <version>${flink.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-streaming-java</artifactId>
                <version>${flink.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-clients</artifactId>
                <version>${flink.version}</version>
            </dependency>
            <dependency>
                <groupId>org.apache.flink</groupId>
                <artifactId>flink-connector-kafka</artifactId>
                <version>4.0.0-2.0</version>
            </dependency>
            <!-- JSON parser and serializer -->
            <dependency>
                <groupId>com.alibaba.fastjson2</groupId>
                <artifactId>fastjson2</artifactId>
                <version>${fastjson.version}</version>
            </dependency>
            <dependency>
                <groupId>com.alibaba.fastjson2</groupId>
                <artifactId>fastjson2-extension</artifactId>
                <version>${fastjson.version}</version>
            </dependency>
            <dependency>
                <groupId>com.alibaba.fastjson2</groupId>
                <artifactId>fastjson2-extension-spring6</artifactId>
                <version>${fastjson.version}</version>
            </dependency>
            <!-- Kafka -->
            <dependency>
                <groupId>org.apache.kafka</groupId>
                <artifactId>kafka-clients</artifactId>
                <version>3.9.1</version>
            </dependency>
            <!-- MySQL -->
            <dependency>
                <groupId>com.mysql</groupId>
                <artifactId>mysql-connector-j</artifactId>
                <version>8.2.0</version>
                <scope>runtime</scope>
            </dependency>
            <!-- MyBatis-Plus core -->
            <dependency>
                <groupId>com.baomidou</groupId>
                <artifactId>mybatis-plus-spring-boot3-starter</artifactId>
                <version>3.5.11</version>
            </dependency>
            <!-- Spring Kafka -->
            <dependency>
                <groupId>org.springframework.kafka</groupId>
                <artifactId>spring-kafka</artifactId>
            </dependency>
            <!-- Lombok (less entity-class boilerplate) -->
            <dependency>
                <groupId>org.projectlombok</groupId>
                <artifactId>lombok</artifactId>
                <version>1.18.26</version>
                <optional>true</optional>
            </dependency>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-web</artifactId>
            </dependency>
            <dependency>
                <groupId>commons-io</groupId>
                <artifactId>commons-io</artifactId>
                <version>2.16.0</version> <!-- recent version; check compatibility with the Flink version -->
            </dependency>
        </dependencies>
    </dependencyManagement>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>${java.version}</source>
                    <target>${java.version}</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
    <repositories>
        <repository>
            <id>central</id>
            <url>https://repo.maven.apache.org/maven2</url>
        </repository>
    </repositories>
</project>
```

What is wrong with my POM? Why can't my flink-job module resolve its dependencies?
Based on the dependency-resolution problem you describe in the Flink job module, here are the checks and fixes:

### 1. Common POM configuration problems

1. **Scala classes missing from the packaged jar**
   Maven does not compile or package Scala code by default; the `scala-maven-plugin` must be configured explicitly:
   ```xml
   <build>
     <plugins>
       <!-- Scala compiler plugin -->
       <plugin>
         <groupId>net.alchim31.maven</groupId>
         <artifactId>scala-maven-plugin</artifactId>
         <version>3.2.0</version>
         <executions>
           <execution>
             <goals>
               <goal>compile</goal>
               <goal>testCompile</goal>
             </goals>
           </execution>
         </executions>
       </plugin>
     </plugins>
   </build>
   ```

2. **Missing Hadoop classpath**
   A missing Hadoop dependency leads to filesystem-access failures:
   ```xml
   <dependencies>
     <!-- required Hadoop dependency -->
     <dependency>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-hdfs</artifactId>
       <version>${hadoop.version}</version> <!-- must be compatible with the Flink version -->
       <scope>provided</scope>
     </dependency>
   </dependencies>
   ```

3. **Wrong scope on Flink core dependencies**
   Marking core dependencies `provided` makes them unavailable at runtime:
   ```xml
   <dependency>
     <groupId>org.apache.flink</groupId>
     <artifactId>flink-streaming-java</artifactId>
     <version>${flink.version}</version>
     <!-- remove the scope, or set it to compile -->
   </dependency>
   ```

### 2. Recommended solution

1. **Complete POM configuration example**:
   ```xml
   <project>
     <properties>
       <flink.version>1.15.2</flink.version>
       <scala.version>2.12</scala.version>
     </properties>
     <dependencies>
       <!-- Flink core -->
       <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-core</artifactId>
         <version>${flink.version}</version>
       </dependency>
       <!-- Hadoop integration -->
       <dependency>
         <groupId>org.apache.flink</groupId>
         <artifactId>flink-hadoop-compatibility</artifactId>
         <version>${flink.version}</version>
       </dependency>
     </dependencies>
     <build>
       <plugins>
         <!-- Scala compilation -->
         <plugin>
           <groupId>net.alchim31.maven</groupId>
           <artifactId>scala-maven-plugin</artifactId>
           <version>3.2.0</version>
           <executions>
             <execution>
               <goals><goal>compile</goal></goals>
             </execution>
           </executions>
         </plugin>
         <!-- Build a fat jar that bundles the dependencies -->
         <plugin>
           <groupId>org.apache.maven.plugins</groupId>
           <artifactId>maven-shade-plugin</artifactId>
           <version>3.3.0</version>
           <executions>
             <execution>
               <phase>package</phase>
               <goals><goal>shade</goal></goals>
               <configuration>
                 <transformers>
                   <transformer implementation="..." />
                 </transformers>
                 <createDependencyReducedPom>false</createDependencyReducedPom>
               </configuration>
             </execution>
           </executions>
         </plugin>
       </plugins>
     </build>
   </project>
   ```

### 3. Verification

1. **Inspect the generated jar**:
   ```bash
   # unpack the jar listing and check for your classes
   jar tvf target/your-job.jar | grep YourMainClass
   ```

2. **Test dependency completeness locally**:
   ```java
   // add a test entry point
   public static void main(String[] args) {
       FileSystem.get(new Configuration()); // verifies the HDFS dependency
   }
   ```

### 4. Related issues

- **Dependency conflicts**: run `mvn dependency:tree` to find version conflicts
- **Classloading problems**: add `classloader: parent-first` to `flink-conf.yaml` (YARN configuration)