1 Introduction

This post describes an approach that preprocesses an image with DOOG filtering, PCA dimensionality reduction, and random sampling, then applies k-means clustering and further processing to perform texture feature analysis, region classification, and judgment of adjacency relationships between regions. Finally, Otsu thresholding, average gray-level (color) matching, and edge detection are used to determine the source and target regions, with the aim of providing an automated image segmentation method.
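
The code excerpt in the next section calls a preprocessing helper (preprocfast) that is not included here, so the DOOG (difference of offset Gaussians) filtering mentioned above is not shown. Purely as a rough illustration of the idea, the sketch below builds a small oriented DOOG filter bank; the function name doogbank and its parameters (sigma, offset, number of orientations) are assumptions made for this example and do not come from the original code, which may compute its texture features differently.

% doogbank.m -- minimal sketch of a DOOG (difference of offset Gaussians) filter bank.
% Assumed behaviour: returns one filtered response per orientation; illustrative only.
function responses = doogbank(img, sigma, offset, norient)
    [h, w] = size(img);
    responses = zeros(h, w, norient);
    g = fspecial('gaussian', 6*ceil(sigma)+1, sigma);   % base Gaussian kernel
    for k = 1:norient
        theta = (k-1)*180/norient;                      % orientation in degrees
        dx =  round(offset*cosd(theta));                % kernel offset along the orientation
        dy = -round(offset*sind(theta));
        doog = circshift(g, [dy dx]) - circshift(g, [-dy -dx]); % two offset Gaussians, subtracted
        responses(:,:,k) = imfilter(img, doog, 'symmetric');
    end
end

A per-pixel texture measure similar to the one used later in the script (a maximum over eight oriented responses) could then be obtained with, for example, max(abs(doogbank(img, 1.5, 2, 8)), [], 3).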
2 Partial Code
%clear all; clc; clf; warning off; close all hidden;
totalt = 0; % Total time spent on segmentation.

% PRE-PROCESS the image to produce a feature set.
% 1. Texture processing using DOOG filters
% 2. Principal component analysis to reduce dimensionality
% 3. Random sampling of image
img = im2double(imread('6.bmp'));      % Read gray image
%img = im2double(imread('girl.bmp'));  % Read color image
disp('Preprocessing...');
tic;

% Preprocess all
[allfeatures, rDims, cDims, depth] = preprocfast(img);
[samples, olddimensions] = size(allfeatures);
gallfeatures = allfeatures;

% Combine all texture features to use for later thresholding
% Also save all low pass features for later adjacency processing
if depth == 1
    texturefeature = max(allfeatures(:,4:11), [], 2);
    lowpassfeature = allfeatures(:,3);
    lowpassimage = reshape(lowpassfeature, [cDims rDims])';
else
    texturefeature = max(allfeatures(:,6:13), [], 2);
    lowpassfeature = allfeatures(:,3:5);
    lowpassimage(:,:,1) = reshape(lowpassfeature(:,1), [cDims rDims])';
    lowpassimage(:,:,2) = reshape(lowpassfeature(:,2), [cDims rDims])';
    lowpassimage(:,:,3) = reshape(lowpassfeature(:,3), [cDims rDims])';
end
textures = reshape(texturefeature, [cDims rDims])';

% Principal-component-based dimensionality reduction of all features
allfeatures = pca(allfeatures, 0.05);

% Choose 10% of samples randomly and save in DATASET
[samples, dimensions] = size(allfeatures);

% We work on ~WORKSAMPLES pixels. If the image has fewer, we use all pixels.
% Otherwise the appropriate portion of pixels is randomly selected.
worksamples = samples/10;
if worksamples < 10000
    worksamples = 10000;
end
if samples < worksamples
    worksamples = samples;
end
choose = rand([samples 1]); choose = choose < (worksamples/samples);
dataset = zeros([sum(choose), dimensions]);
dataset(1:sum(choose),:) = allfeatures(find(choose),:); % find(choose) returns indices where choose is non-zero

disp('Preprocessing done.');
t = toc; totalt = totalt + t;
disp([' Original dimensions: ' int2str(olddimensions)]);
disp([' Reduced dimensions by PCA: ' int2str(dimensions)]);
disp([' Image has ' int2str(rDims * cDims) ' pixels.']);
disp([' Using only ' int2str(size(dataset,1)) ' pixels.']);
disp(['Elapsed time: ' num2str(t)]);
disp(' ');

% SEGMENTATION
% 1. k-means (on sampled image)
% 2. Use centroids to classify remaining points
% 3. Classify spatially disconnected regions as separate regions

% Segmentation Step 1.
% k-means (on sampled image)
% Compute k-means on randomly sampled points
disp('Computing k-means...');
tic;

% Set number of clusters heuristically.
k = round((rDims*cDims)/(100*100)); k = max(k,8); k = min(k,16);

% Uncomment this line when MATLAB k-means is unavailable
%[centroids, esq, map] = kmeanlbg(dataset, k);
[map, centroids] = kmeans(dataset, k); % Calculate k-means (use MATLAB k-means)
disp('k-means done.');
t = toc; totalt = totalt + t;
disp([' Number of clusters: ' int2str(k)]);
disp(['Elapsed time: ' num2str(t)]);
disp(' ');

% Segmentation Step 2.
% Use centroids to classify the remaining points
disp('Using centroids to classify all points...');
tic;
globsegimage = postproc(centroids, allfeatures, rDims, cDims); % Use centroids to classify all points

% Segmentation Step 3.
% Classify spatially disconnected regions as separate regions
globsegimage = medfilt2(globsegimage, [3 3], 'symmetric');
globsegimage = imfill(globsegimage);
region_count = max(max(globsegimage));
count = 1; newglobsegimage = zeros(size(globsegimage));
for i = 1:region_count
    region = (globsegimage == i);
    [bw, num] = bwlabel(region);
    for j = 1:num
        newglobsegimage = newglobsegimage + count*(bw == j);
        count = count + 1;
    end
end
oldglobsegimage = globsegimage;
globsegimage = newglobsegimage;
disp('Classification done.');
t = toc; totalt = totalt + t;
disp(['Elapsed time: ' num2str(t)]);
disp(' ');

% DISPLAY IMAGES
% Display segments
%figure(1), imshow(globsegimage./max(max(globsegimage)));
figure(1), imshow(label2rgb(globsegimage, 'gray'));
title('Segments');

% Calculate boundary of segments
BW = edge(globsegimage, 'sobel', 0);

% Superimpose boundary on original image
iout = img;
if (depth == 1) % Gray image, so use color lines
    iout(:,:,1) = iout;
    iout(:,:,2) = iout(:,:,1);
    iout(:,:,3) = iout(:,:,1);
    iout(:,:,2) = min(iout(:,:,2) + BW, 1.0);
    iout(:,:,3) = min(iout(:,:,3) + BW, 1.0);
else            % RGB image, so use white lines
    iout(:,:,1) = min(iout(:,:,1) + BW, 1.0);
    iout(:,:,2) = min(iout(:,:,2) + BW, 1.0);
    iout(:,:,3) = min(iout(:,:,3) + BW, 1.0);
end

% Display image and segments
figure(2), imshow(iout);
title('Segmented image');

% POST PROCESSING AND AUTOMATIC SELECTION OF SOURCE AND TARGET REGIONS
% 1. Find overall textured region using Otsu's method (MATLAB graythresh)
% 2. Save each region and region boundary separately and note index of
%    textured regions
% 3. For each textured region, find all adjacent untextured regions and
%    save in adjacency matrix. Regions having a significant common border
%    are considered adjacent.
% 4. Find similarity between textured and adjacent untextured regions
%    using average gray level matching (average color matching). For each
%    textured region, drop those adjacent regions which don't match in
%    gray level.
disp('Post-processing and automatically selecting source and target regions...');
tic;

% POSTPROC Step 1
threshold = graythresh(rescalegray(textures));
bwtexture = textures > threshold;
tex_edges = edge(bwtexture, 'sobel', 0);
figure(3),
if depth == 1
    imshow(min(img + tex_edges, 1));
else
    imshow(min(img + cat(3, tex_edges, tex_edges, tex_edges), 1));
end
title('Textured regions');

% POSTPROC Step 2
% Save each region in a dimension
% Save each region boundary in a dimension
% For each region which can be classified as a textured region, store its index
region_count = max(max(globsegimage));
number_tex_regions = 1; tex_list = [];
for region_number = 1:region_count
    bwregion = (globsegimage == region_number);
    regions(:,:,region_number) = bwregion; % Save all regions
    region_boundaries(:,:,region_number) = edge(bwregion, 'sobel', 0);
    if ( (sum(sum(bwregion.*bwtexture))/sum(sum(bwregion)) > 0.75) && sum(sum(bwregion)) > (32*32) )
        tex_list = [tex_list region_number];
        number_tex_regions = number_tex_regions + 1;
    end
end
number_tex_regions = number_tex_regions - 1;

% POSTPROC Step 3
% Find texture region adjacency and create an adjacency matrix
for i = 1:size(tex_list, 2)
    for j = 1:region_count
        if (tex_list(i) ~= j)
            boundary_overlap = sum(sum( region_boundaries(:,:,tex_list(i)).*region_boundaries(:,:,j) ));
            boundary_total_length = sum(sum(region_boundaries(:,:,tex_list(i)))) + sum(sum(region_boundaries(:,:,j)));
            if (boundary_overlap/boundary_total_length > 0)  % the regions share some boundary
                region_adjacency(i,j) = boundary_overlap;    % record overlap length; significance is tested below
            end
        end
    end
end

% EXPERIMENTAL
% Find adjacency matrix between all regions and segment the regions using
% N-Cut.
% for i = 1:region_count
%     region_feature(i,:) = get_region_features(regions(:,:,i), allfeatures);
% end
% for i = 1:region_count
%     for j = 1:region_count
%         W(i,j) =
% END EXPERIMENTAL

% Those regions for which the edge overlap length is less than 20% of the
% mean overlap length are not considered adjacent. Update the adjacency
% matrix to reflect this.
region_adj_hard_coded = (region_adjacency - 0.2*repmat(mean(region_adjacency,2), [1 size(region_adjacency,2)])) > 0;

% Copy adjacency into another variable and remove all references to
% textured regions from the adjacency matrix.
region_output = region_adj_hard_coded;
for tex_count = 1:size(tex_list, 2)
    region_output(:,tex_list(tex_count)) = 0;
end

% POSTPROC Step 4
% Find similarity between textured and adjacent untextured regions
% (This could be changed to a chi-squared distance between histograms of
% textLP and adjacent by commenting out required code, and uncommenting
% other code, as directed in the source)

% For all textured regions find and save average brightness
for tex_count = 1:size(tex_list, 2)
    if depth == 1
        tex_avg_bright(tex_count) = sum(sum(regions(:,:,tex_list(tex_count)).*lowpassimage)) ...
            / sum(sum(regions(:,:,tex_list(tex_count))));
        % Comment previous and uncomment next line(s) to use histogram processing
        %tex_hist{tex_count} = histproc(regions(:,:,tex_list(tex_count)), lowpassimage);
    else
        tex_avg_bright(1,tex_count) = sum(sum(regions(:,:,tex_list(tex_count)).*lowpassimage(:,:,1))) ...
            / sum(sum(regions(:,:,tex_list(tex_count))));
        tex_avg_bright(2,tex_count) = sum(sum(regions(:,:,tex_list(tex_count)).*lowpassimage(:,:,2))) ...
            / sum(sum(regions(:,:,tex_list(tex_count))));
        tex_avg_bright(3,tex_count) = sum(sum(regions(:,:,tex_list(tex_count)).*lowpassimage(:,:,3))) ...
            / sum(sum(regions(:,:,tex_list(tex_count))));
        % Comment previous and uncomment next line(s) to use histogram processing
        % tex_hist{tex_count} = histproc(regions(:,:,tex_list(tex_count)), lowpassimage);
    end
end

% For all textured regions, consider each non-textured region and update the
% adjacency matrix. Keep the relationship if the gray levels (colors) are
% similar and drop it if they don't match.
for tex_count = 1:size(tex_list, 2)                      % For all textured regions
    for adj_reg_count = 1:size(region_adj_hard_coded, 2)
        if (region_adj_hard_coded(tex_count, adj_reg_count) > 0)
            if depth == 1
                region_avg_bright = sum(sum(regions(:,:,adj_reg_count).*lowpassimage)) ...
                    / sum(sum(regions(:,:,adj_reg_count)));
                % Comment previous and uncomment next line(s) to use histogram processing
                % region_hist = histproc(regions(:,:,adj_reg_count), lowpassimage);
            else
                region_avg_bright(1) = sum(sum(regions(:,:,adj_reg_count).*lowpassimage(:,:,1))) ...
                    / sum(sum(regions(:,:,adj_reg_count)));
                region_avg_bright(2) = sum(sum(regions(:,:,adj_reg_count).*lowpassimage(:,:,2))) ...
                    / sum(sum(regions(:,:,adj_reg_count)));
                region_avg_bright(3) = sum(sum(regions(:,:,adj_reg_count).*lowpassimage(:,:,3))) ...
                    / sum(sum(regions(:,:,adj_reg_count)));
                % Comment previous and uncomment next line(s) to use histogram processing
                % region_hist = histproc(regions(:,:,adj_reg_count), lowpassimage);
            end
            if depth == 1
                if abs(tex_avg_bright(tex_count) - region_avg_bright) > 0.2 % Not similar enough, so drop
                    region_output(tex_count, adj_reg_count) = 0;
                end
                % Comment previous and uncomment next line(s) to use histogram processing
                % if chisq(tex_hist{tex_count}, region_hist) > 0.4
                %     chisq(tex_hist{tex_count}, region_hist)
                %     region_output(tex_count, adj_reg_count) = 0;
                % end
            else
                if mean(abs(tex_avg_bright(:,tex_count) - region_avg_bright')) > 0.2
                    region_output(tex_count, adj_reg_count) = 0;
                end
                % Comment previous and uncomment next line(s) to use histogram processing
                % thist = tex_hist{tex_count};
                % chisq(thist(:,1),region_hist(:,1))
                % chisq(thist(:,2),region_hist(:,2))
                % chisq(thist(:,3),region_hist(:,3))
                % t = 0.9;
                % if (chisq(thist(:,1),region_hist(:,1)) > t) || ...
                %    (chisq(thist(:,2),region_hist(:,2)) > t) || ...
                %    (chisq(thist(:,3),region_hist(:,3)) > t)
                %     region_output(tex_count, adj_reg_count) = 0;
                % end
            end
        end
    end
end

disp('Post-processing done.'); t = toc; totalt = totalt + t;
disp(['Elapsed time: ' num2str(t)]);
disp(' ');
disp(['Total time elapsed: ' int2str(floor(totalt/60)) ' minutes ' int2str(mod(totalt,60)) ' seconds.']);

% DISPLAY IMAGES
% Display source and target regions.
if depth == 1
    imgs = zeros([rDims cDims size(tex_list,2)]);
    for tex_count = 1:size(tex_list, 2)
        if (sum(region_output(tex_count,:) > 0))                         % If we have target patches
            imgs(:,:,tex_count) = regions(:,:,tex_list(tex_count)).*img; % Save that source patch
            for i = 1:size(region_output, 2)                             % For each region
                if (region_output(tex_count, i) > 0)                     % which is connected to that source patch
                    imgs(:,:,tex_count) = imgs(:,:,tex_count) + 0.5*regions(:,:,i).*img; % Save the target patch
                end
            end
            figure, imshow(min(imgs(:,:,tex_count) + BW, 1));
            ggg{tex_count} = min(imgs(:,:,tex_count) + BW, 1);
            title('Potential source and target regions');
        end
    end
else % depth == 3
    count = 1;
    for tex_count = 1:size(tex_list, 2)
        if (sum(region_output(tex_count,:) > 0))                         % If we have target patches
            tmp(:,:,1) = regions(:,:,tex_list(tex_count)).*img(:,:,1);
            tmp(:,:,2) = regions(:,:,tex_list(tex_count)).*img(:,:,2);
            tmp(:,:,3) = regions(:,:,tex_list(tex_count)).*img(:,:,3);
            imgs{count} = tmp;
            for i = 1:size(region_output, 2)                             % For each region
                if (region_output(tex_count, i) > 0)                     % which is connected to that source patch
                    tmp(:,:,1) = 0.5*regions(:,:,i).*img(:,:,1);
                    tmp(:,:,2) = 0.5*regions(:,:,i).*img(:,:,2);
                    tmp(:,:,3) = 0.5*regions(:,:,i).*img(:,:,3);
                    imgs{count} = imgs{count} + tmp;
                end
            end
            figure, imshow(min(imgs{count} + cat(3,BW,BW,BW), 1));
            ggg{count} = min(imgs{count} + cat(3,BW,BW,BW), 1);
            title('Potential source and target regions');
            count = count + 1;
        end
    end
end
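
The script above also depends on several project-specific helpers that are not part of this excerpt: preprocfast, a custom pca, kmeanlbg, postproc, rescalegray, histproc, and chisq. For readers who want to reproduce something similar, the minimal sketches below show one plausible reading of three of them, each saved as its own .m file. They are assumptions about what these helpers do (nearest-centroid labelling, linear rescaling to [0, 1], and a chi-squared histogram distance), not the author's original implementations.

% postproc.m -- minimal sketch: label every pixel by its nearest k-means centroid
% and reshape the labels into an image using the same reshape([cDims rDims])'
% convention as the main script. Illustrative only; the original helper may differ.
function segimage = postproc(centroids, allfeatures, rDims, cDims)
    n = size(allfeatures, 1);
    k = size(centroids, 1);
    dists = zeros(n, k);
    for c = 1:k
        delta = allfeatures - repmat(centroids(c,:), [n 1]);
        dists(:,c) = sum(delta.^2, 2);        % squared Euclidean distance to centroid c
    end
    [~, labels] = min(dists, [], 2);          % index of the nearest centroid per pixel
    segimage = reshape(labels, [cDims rDims])';
end

% rescalegray.m -- minimal sketch: linearly rescale a matrix to [0, 1] so that
% graythresh (Otsu) can be applied. Illustrative only.
function out = rescalegray(in)
    in = double(in);
    out = (in - min(in(:))) / max(max(in(:)) - min(in(:)), eps);
end

% chisq.m -- minimal sketch of a chi-squared distance between two histograms,
% as used by the commented-out histogram-matching branch. Illustrative only.
function d = chisq(h1, h2)
    h1 = h1(:) / max(sum(h1(:)), eps);        % normalize to unit mass
    h2 = h2(:) / max(sum(h2(:)), eps);
    d = 0.5 * sum(((h1 - h2).^2) ./ max(h1 + h2, eps));
end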
3 Simulation Results

4 References
[1] Ma Haoran. Research on Texture-Based Image Segmentation Methods [D]. University of Electronic Science and Technology of China.
About the blogger: experienced in MATLAB simulation across many areas, including intelligent optimization algorithms, neural network prediction, signal processing, cellular automata, image processing, path planning, and UAVs. Questions about the related MATLAB code can be discussed via private message.
Some of the theory is cited from online literature; if there is any infringement, please contact the blogger for removal.