Getting the host from a URL with a regular expression in Scala

This post shows how to extract the hostname from a complex URL using a regular expression, with a concrete example demonstrating how to match and capture the target string.

Today I'd like to share how to extract the host from a URL with a regular expression. This comes up a lot in real-world development, so here it is.
Code:

import java.util.regex.Pattern

/**
 * Created by Administrator on 2017/9/26.
 */
object UrlGetHostTest {
  def main(args: Array[String]): Unit = {
    // sample URLs to test against
    val url1 = "http://tieba.baidu.com/p/4336698825"
    val url2 = "http://mp.weixin.qq.com/s?__biz=MzIyODgyNDk0OQ==&mid=2247483988&idx=3&sn=7181bbef257e27014051272d785eeafd&scene=4#wechat_redirect"
    var host = ""
    // Match runs of word characters separated by dots; the lookbehind accepts
    // either "//" or nothing, so URLs without a scheme prefix also match.
    val p = Pattern.compile("(?<=//|)((\\w)+\\.)+\\w+")
    val matcher = p.matcher(url2)
    if (matcher.find()) {
      host = matcher.group() // first match is the host, e.g. "mp.weixin.qq.com"
    }
    println(host)
  }
}
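
For comparison, java.net.URI can also pull the host out of a well-formed URL without a hand-written pattern. The snippet below is only a minimal sketch using the same sample URL as above (the object name UriHostExample is just for illustration); note that URI.getHost returns null when the string has no scheme or authority, which is exactly the case the empty alternative in the regex version is meant to cover.

import java.net.URI

object UriHostExample {
  def main(args: Array[String]): Unit = {
    val url = "http://mp.weixin.qq.com/s?__biz=MzIyODgyNDk0OQ==&mid=2247483988&idx=3&sn=7181bbef257e27014051272d785eeafd&scene=4#wechat_redirect"
    // getHost returns the host part of the authority, or null for scheme-less strings
    val host = Option(new URI(url).getHost).getOrElse("")
    println(host) // prints: mp.weixin.qq.com
  }
}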

OK, that's all for this post.
