Thoughts Prompted by a Stream.Position Problem

This article looks at the implementation and troubleshooting of an image-upload feature in web development, analyzing in detail a case where an uploaded image could not be displayed and how the problem was solved.
File upload has always been one of the trickier parts of web development, especially when building Ajax sites: with pages full of complex elements, implementing a refresh-free file upload becomes even harder. In http://www.cnblogs.com/jaxu/archive/2009/05/19/1459796.html I described how to upload a file by submitting the form through a hidden iFrame, so that the form on the current page is not refreshed and the upload appears to happen without a page refresh. On the server side, we usually handle the uploaded files like this:
private List<ImageEntity> GetUploadImages()
{
    List<ImageEntity> images = new List<ImageEntity>();
    HttpFileCollection files = Request.Files;
    int fileLen;
    ImageEntity image = null;

    if (files != null && files.Count > 0)
    {
        for (int i = 0; i < files.Count; i++)
        {
            if (files[i].FileName.Length != 0)
            {
                try
                {
                    fileLen = files[i].ContentLength;
                    Byte[] bytes = new Byte[fileLen];
                    // Get the posted file's input stream and read its data into the byte array
                    Stream stream = files[i].InputStream;
                    stream.Read(bytes, 0, fileLen);

                    image = new ImageEntity();
                    image.Type = files[i].ContentType;
                    image.ImageBlob = bytes;
                    image.Title = Path.GetFileName(files[i].FileName);
                    images.Add(image);
                }
                catch { }
            }
        }
    }

    return images;
}
    This sample code collects the image files uploaded by the client. ImageEntity is an entity class describing an image; with the information it carries we can store the image in a database (or on the server's disk). A sketch of that persistence step follows the class definition below.
public class ImageEntity
{
    public ImageEntity()
    {
    }

    public ImageEntity(string title, Byte[] imageBlob, string type)
    {
        Title = title;
        ImageBlob = imageBlob;
        Type = type;
    }

    public string Title { get; set; }
    public string Type { get; set; }
    public Byte[] ImageBlob { get; set; }
}
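
    As an illustration of the persistence step mentioned above, here is a minimal sketch that saves an ImageEntity with ADO.NET. The Images table, its columns, and the connection string are assumptions made for this example only; they are not part of the original article:

// Minimal sketch: persisting an ImageEntity with ADO.NET
// (requires System.Data and System.Data.SqlClient).
// The "Images(Title, Type, ImageBlob)" table and the connection string are hypothetical.
private void SaveImage(ImageEntity image)
{
    string connectionString = "...";  // assumed to come from configuration
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(
        "INSERT INTO Images (Title, Type, ImageBlob) VALUES (@Title, @Type, @ImageBlob)", conn))
    {
        cmd.Parameters.AddWithValue("@Title", image.Title);
        cmd.Parameters.AddWithValue("@Type", image.Type);
        cmd.Parameters.Add("@ImageBlob", SqlDbType.VarBinary).Value = image.ImageBlob;
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}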

    Used this way, the code works fine. When an image is uploaded, the page is posted to the server, the server allocates a byte array the same size as the image, obtains a System.IO.Stream from the HttpPostedFile's InputStream property, and then calls the stream's Read method to copy the image data into the byte array. The page does perform a postback during this process, and the next time the user uploads an image the Stream object is constructed anew and the byte array is filled again.

    However, during a MOSS development project I happened to find that when uploading images this way, saving the image to the database worked fine, yet when the image was read back from the database the page showed a red X, meaning the image was unavailable or failed to load. I examined the data in the database carefully: apart from the column holding the image's binary data, which displayed as <Binary data>, every other column looked normal and nothing seemed out of place. Stepping through the program, no exception was thrown anywhere during upload or save; everything appeared to go smoothly, yet the image simply would not load on the page.

    The problem felt strange from the start. It looked as though the image's binary data was being captured incorrectly or incompletely at upload time. I debugged the code again and found that during the upload the values in the byte array were always 0, even after it had supposedly been filled by Stream.Read. Why? Was I using the wrong object? I checked MSDN, and its samples essentially retrieve the uploaded file the same way and fill the byte array with Stream.Read; MSDN's sample code surely isn't wrong, so where was the mistake? I was getting confused...
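
    One way to make this symptom visible (a hypothetical diagnostic snippet, not from the original post) is to look at the value Stream.Read returns together with Stream.Position: Read reports how many bytes it actually copied, and when the stream is already positioned at its end it copies nothing, which leaves the byte array full of zeros.

// Hypothetical diagnostic snippet (uses System.Diagnostics.Debug):
// if the stream is already at its end, Read returns 0
// and the byte array stays filled with zeros.
Stream stream = files[i].InputStream;
Debug.WriteLine("Position before Read: " + stream.Position);  // expected 0
int bytesRead = stream.Read(bytes, 0, fileLen);
Debug.WriteLine("Bytes actually read: " + bytesRead);         // 0 when Position == Length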

    A careful Google search turned up a suggestion that proved very helpful: before filling the byte array with Stream.Read, check whether Stream.Position is 0, and if it is not, set it back to 0 first, then fill the array. My first thought was that the Stream object constructed for each upload should be brand new (the page posts back every time), and I was not caching the Stream anywhere in my code; if the Stream is new, its Position must already be 0. Half in doubt, I tried the suggestion anyway, and it worked. Apparently the Stream object really was being cached somewhere, or at least had not been fully released after the previous upload. So I modified the method above that collects the uploaded images.

private List<ImageEntity> GetUploadImages()
{
    List<ImageEntity> images = new List<ImageEntity>();
    HttpFileCollection files = Request.Files;
    int fileLen;
    ImageEntity image = null;

    if (files != null && files.Count > 0)
    {
        for (int i = 0; i < files.Count; i++)
        {
            if (files[i].FileName.Length != 0)
            {
                try
                {
                    fileLen = files[i].ContentLength;
                    Byte[] bytes = new Byte[fileLen];
                    using (Stream stream = files[i].InputStream)
                    {
                        // Reset the position before reading; otherwise the byte array
                        // stays empty if the stream has already been read (or cached) elsewhere.
                        stream.Position = 0;
                        stream.Read(bytes, 0, fileLen);
                    }

                    image = new ImageEntity();
                    image.Type = files[i].ContentType;
                    image.ImageBlob = bytes;
                    image.Title = Path.GetFileName(files[i].FileName);
                    images.Add(image);
                }
                catch { }
            }
        }
    }

    return images;
}

    Because I was developing on the MOSS platform, which differs from an ordinary ASP.NET project in many ways, the behavior may well be affected by things MOSS itself does, such as its caching mechanisms. Either way, with an object like Stream it is best to release it as soon as you are done with it: even if the Stream is not cached, it can occupy memory for a long time and consume a lot of server resources. I recommend using the Stream inside a using statement and resetting Position to 0 before calling Read to fill the byte array; that ensures the array is filled correctly and you end up with the right file data. A helper that combines these suggestions is sketched below.
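
    The following small helper is an illustration of that recommendation, not code from the original article: it wraps the stream in a using block, rewinds it only when the stream supports seeking, and keeps calling Read until the buffer is full, since Read may return fewer bytes than requested.

// Sketch of a helper that applies the recommendations above:
// use a using block, rewind the stream when possible, and read in a loop
// because Stream.Read may return fewer bytes than requested.
private static byte[] ReadAllBytes(HttpPostedFile file)
{
    byte[] bytes = new byte[file.ContentLength];
    using (Stream stream = file.InputStream)
    {
        if (stream.CanSeek)
        {
            stream.Position = 0;  // rewind in case the stream was read earlier
        }
        int offset = 0;
        while (offset < bytes.Length)
        {
            int read = stream.Read(bytes, offset, bytes.Length - offset);
            if (read == 0)
            {
                break;  // end of stream
            }
            offset += read;
        }
    }
    return bytes;
}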


This article is reposted from Jaxu's cnblogs blog. Original link: http://www.cnblogs.com/jaxu/archive/2009/05/19/1460250.html. Please contact the original author before reprinting.

