Python Debug (Part 1)

This post covers two common errors in Python programming: unexpected indent and invalid syntax. When you hit an unexpected indent, check the indentation of your code; an invalid syntax message means the code contains a syntax error, so review it carefully against Python's syntax rules.


1. unexpected indent

Python is strict about code formatting. "Indentation" refers to how far a line is indented from the left margin, and unexpected indent means the interpreter found indentation where it did not expect any. When this happens, check the formatting of your code!
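
For example, here is a minimal, made-up snippet (not from the original post) that triggers the error; the stray leading spaces on the second line are the problem:

# indent_demo.py -- hypothetical example; the second line is indented for no reason
name = "iris"
    print(name)  # running this stops with "IndentationError: unexpected indent"

Removing the leading spaces (or making the indentation match the surrounding block) fixes it.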

2. invalid syntax

invalid  adj. not valid; not legitimate; having no force

syntax  n. syntax; the rules for how statements are structured

invalid syntax therefore means the code breaks Python's grammar rules. When you see this message, check your syntax carefully (missing colons, unclosed brackets, and mismatched quotes are common culprits).
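
As a made-up illustration (not from the original post), the snippet below drops the colon after the if condition, one of the most common triggers of this message:

# syntax_demo.py -- hypothetical example; the if statement is missing its colon
x = 10
if x > 5  # should be "if x > 5:"
    print("x is greater than 5")

Depending on your Python version, the interpreter reports either "SyntaxError: invalid syntax" or a more specific hint such as "SyntaxError: expected ':'"; adding the colon fixes it.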

 

Hehe, and now for today's code from my beginner NLP study (I haven't fully figured out what the code means yet...)!

# -*- coding: utf-8 -*-
"""
Spyder Editor

This is a temporary script file.
"""

"""
Created on Mon Aug 15 21:00:27 2016
@author: amnesia
"""

from sklearn.datasets import load_iris

# Load the built-in iris dataset
iris = load_iris()

# The feature (column) names, the response vector, and the class names
print(iris.feature_names)
print(iris.target)
print(iris.target_names)

# The object types of the feature matrix and the response array
print(type(iris.data))
print(type(iris.target))

# The shape of the feature matrix: (n_samples, n_features)
print(iris.data.shape)

# -*- coding: utf-8 -*-
"""
Created on Tue Aug 16 19:29:30 2016

@author: amnesia

"""
from sklearn.datasets import load_iris
# Import LinearSVC class
from sklearn.svm import LinearSVC
# Import KNeighborsClassifier class
from sklearn.neighbors import KNeighborsClassifier
# Load the dataset
iris = load_iris()
# Assign to variables for more convenient handling
X = iris.data
y = iris.target
# Create an instance of the LinearSVC classifier
clf = LinearSVC()
# Train the model
clf.fit(X, y)
# Get the accuracy score of the LinearSVC classifier
print(clf.score(X, y))
# Predict the response given a new observation
print(clf.predict([[6.3, 3.3, 6.0, 2.5]]))
# Create an instance of KNeighborsClassifier
# The default number of K neighbors is 5.
# This can be changed by passing n_neighbors=k as argument
knnDefault = KNeighborsClassifier() # K = 5
# Train the model
knnDefault.fit(X, y)
# Get the accuracy score of KNeighborsClassifier with K = 5
print(knnDefault.score(X, y))
# Predict the response given a new observation
print(knnDefault.predict([[6.3, 3.3, 6.0, 2.5]]))
# Let's try a different number of neighbors
knnBest = KNeighborsClassifier(n_neighbors=10) # K = 10
# Train the model
knnBest.fit(X, y)
# Get the accuracy score of KNeighborsClassifier with K = 10
print(knnBest.score(X, y))
# Predict the response given a new observation
print(knnBest.predict([[6.3, 3.3, 6.0, 2.5]]))
# Let's try a different number of neighbors
knnWorst = KNeighborsClassifier(n_neighbors=100) # K = 100
# Train the model
knnWorst.fit(X, y)
# Get the accuracy score of KNeighborsClassifier with K = 100
print(knnWorst.score(X, y))
# Predict the response given a new observation
print(knnWorst.predict([[6.3, 3.3, 6.0, 2.5]]))
