Start up of DSL

VSTS and DSL have been talked about for a long time; now it's about time to actually give them a try... One big reason I hadn't tried them before is that I've always worked in Java :)
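For anyone coming from the Java world like me, the quickest way to get a feel for the "DSL" idea before installing the VSTS DSL Tools is a small "internal DSL" built on a fluent API. The sketch below is purely illustrative; the Order/item/shipTo names and the order-processing domain are hypothetical and have nothing to do with the Microsoft tooling:

// A minimal sketch of an internal DSL in plain Java.
// The order-processing domain and all names here are hypothetical,
// not taken from the VSTS DSL Tools.
public class OrderDslDemo {

    /** Fluent builder whose call chain reads like domain language. */
    static class Order {
        private final StringBuilder text = new StringBuilder("Order:");

        static Order order() {
            return new Order();
        }

        Order item(String name, int quantity) {
            text.append(' ').append(quantity).append("x ").append(name).append(';');
            return this;
        }

        Order shipTo(String city) {
            text.append(" ship to ").append(city);
            return this;
        }

        @Override
        public String toString() {
            return text.toString();
        }
    }

    public static void main(String[] args) {
        // Ordinary Java, but the chained calls read like a sentence in the domain.
        Order order = Order.order()
                .item("DSL Tools beta", 1)
                .shipTo("Shanghai");
        System.out.println(order);
        // Prints: Order: 1x DSL Tools beta; ship to Shanghai
    }
}

The graphical DSLs in the VSTS DSL Tools take the same idea further: instead of hiding the domain language behind a fluent API, you define the domain model and notation explicitly and generate code from it.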

Today I downloaded the installer and some tutorials. Anyone who wants to give it a try can simply reuse the results of my Googling. The main resources are roughly as follows:

Download:

http://www.microsoft.com/downloads/details.aspx?FamilyId=7E0FDD66-698A-4E6A-B373-BD0642847AB7&displaylang=en

Support site (MSDN forum):

http://forums.microsoft.com/msdn/showforum.aspx?forumid=61&siteid=1

Video tutorial (webcast):

http://www.msdnwebcast.com.cn/msdnsmartcast/CourseDetails.aspx?id=215

If you want to look at other DSM (Domain-Specific Modeling) tools or related background, check out this site:

http://www.dsmforum.org/

Appendix: earlier DSL-related posts from my blog:

Keith Short's blog on VSTS

DSL: The Next Wave?

Microsoft vs. OMG: DSL vs. UML

MDA and DSL: Born Strangers?

[Reading notes] Microsoft's road to MDA: Visual Studio 2005 Team System modeling strategy and FAQ

BTW, today seems to be Munch's birthday, heh; I saw the Google doodle:
