I migrated a Rails application from Linux to Windows 2003 and switched the proxy server from lighttpd to IIS. After a quick trial run, everything looked basically fine.
But in the admin backend, uploading an image failed with the error: Size is not included in the list.
I looked into it at noon. At first I assumed the file size had exceeded some limit, but it hadn't. Adding some debug output showed that self.size was 0, yet going back and uploading again succeeded, so the problem was apparently in how the file was being saved.
Googling around, I found someone who had "solved" it by calling sleep 5 before reading the size. That workaround is far too sloppy.
Another blog post, by a developer abroad, has a much better fix. I'm reposting it here. I hate the GFW.
Fixing attachment_fu on Windows
Like many others, I've encountered issues when developing Rails applications using attachment_fu on Windows. After doing some research, I've come up with the following solution. The problem has two parts:
- Size is not included in the list error message,
- Timeout error when uploading to S3.
Fixing "Size is not included in the list" error message
Some people have reported a timing issue with Tempfile on Windows when trying to get the file size: the size of the file is not properly reported by Windows right after data has been written to it. Proposed solutions for this problem include:
- Sleeping in a loop as long as the file size is 0,
- Reading back the entire file in memory.
I think I found a better and less patchy solution for this issue: forcing the OS to flush the file to disk before reading its size.
Here is the code to do it:

    require 'tempfile'

    class Tempfile
      def size
        if @tmpfile
          @tmpfile.fsync # added this line
          @tmpfile.flush
          @tmpfile.stat.size
        else
          0
        end
      end
    end
Doing a flush is not enough: flush empties Ruby's own buffer, but the OS may not write the file to disk immediately. Calling fsync ensures that the OS has written the file to disk before we continue. After that, Windows reports the actual file size correctly.
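To illustrate the flush/fsync sequence outside of attachment_fu, here is a minimal standalone sketch (file name and sizes are illustrative) that writes to a Tempfile and then reads its size from the filesystem, the same way the patched Tempfile#size does:

```ruby
require 'tempfile'

# Write some data to a temp file, then ask the filesystem for its size.
# Without fsync, stat could report 0 on Windows because the OS has not
# yet committed the buffered data to disk.
t = Tempfile.new('upload')
t.write('x' * 1024)
t.flush  # flush Ruby's internal buffer to the OS
t.fsync  # ask the OS to commit the data to disk
puts File.stat(t.path).size  # 1024
t.close
t.unlink
```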
Fixing the Timeout error when uploading to S3
This issue is related to opening files for reading on Windows. On Windows, you have to open the file in binary mode. So patching attachment_fu is simple:

    require 'technoweenie/attachment_fu/backends/s3_backend'

    module Technoweenie
      module AttachmentFu
        module Backends
          module S3Backend
            protected

            def save_to_storage
              if save_attachment?
                S3Object.store(
                  full_filename,
                  (temp_path ? File.open(temp_path, "rb") : temp_data), # added "rb"
                  bucket_name,
                  :content_type => content_type,
                  :access => attachment_options[:s3_access]
                )
              end
              @old_filename = nil
              true
            end
          end
        end
      end
    end

I've also included a fix from someone else (which was not enough in itself to solve my S3 upload problem):

    module Technoweenie
      module AttachmentFu
        # Gets the data from the latest temp file. This will read the file into memory.
        def temp_data
          if save_attachment?
            f = File.new(temp_path)
            f.binmode
            return f.read
          else
            return nil
          end
        end
      end
    end
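Why binary mode matters: in text mode on Windows, CRLF sequences are translated and a 0x1A (Ctrl-Z) byte is treated as end-of-file, so binary data such as an image gets silently mangled or truncated. The following standalone sketch (file name and payload are illustrative) shows that opening with "rb" reads every byte back intact:

```ruby
# A PNG-like header deliberately containing \r\n and a 0x1A byte,
# the characters that Windows text mode would translate or truncate on.
data = "\x89PNG\r\n\x1A\n" + "payload"

File.open('sample.bin', 'wb') { |f| f.write(data) }

# Binary mode ("rb") returns the raw bytes unchanged on any platform.
read_back = File.open('sample.bin', 'rb') { |f| f.read }
puts read_back.bytesize  # 15, same as data.bytesize

File.delete('sample.bin')
```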
Wrapping it up
So I put all this code in lib/attachment_fu_patch.rb and required it in environment.rb. Problem fixed!
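For reference, the wiring is just one line. This is a sketch for the Rails 1.x/2.x era that attachment_fu targets, where lib/ is already on the load path; the file name lib/attachment_fu_patch.rb follows the article:

```ruby
# config/environment.rb -- after the Rails::Initializer.run block.
# lib/ is on the load path in these Rails versions, so a plain
# require picks up lib/attachment_fu_patch.rb.
require 'attachment_fu_patch'
```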
Note: I did not test this on other OSes, but these fixes should not have any adverse effects.