How to Handle Large File Uploads?
I'm actually posting this as a question. If you're looking for the answer, sorry I don't have it yet.
How can we reasonably handle large file uploads? I'm talking in the >100MB range; YouTube, for instance, now supports 2GB files, and this will become increasingly the norm. I don't think that most servers are up to that yet, particularly if you need an application to scale.
Currently, using PHP, you need to set memory_limit to more than twice the upload_max_filesize, which as you can see would be prohibitive in the example of 2GB uploads; you'd need to set your PHP memory to >4GB (plus the 64M buffer or whatever you need to run Drupal). EDIT: Looks like I was incorrect in my assumption: PHP streams an incoming upload to a temporary file on disk, so if you're not going to process the file, you don't need a huge memory footprint just to handle the raw upload. Thanks Nate and Jamie!
Even if you manage to have that kind of memory available, you can probably expect things to 'splode with concurrent uploads...
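For reference, here's roughly the shape of the php.ini directives involved. The values are illustrative rather than recommendations, and (per the correction above) memory_limit only matters if you actually load the file into memory:

    ; Illustrative php.ini settings for accepting large uploads.
    ; upload_max_filesize caps a single file; post_max_size caps the
    ; entire POST body, so it must be at least as large.
    upload_max_filesize = 2048M
    post_max_size = 2100M

    ; PHP streams the incoming file to a temporary file on disk, so
    ; memory_limit does NOT need to cover the file size -- only what
    ; your application (Drupal plus modules) needs to run.
    memory_limit = 256M

    ; Long transfers need generous time limits.
    max_input_time = 3600
    max_execution_time = 3600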
So I spent some time yesterday looking at SWFUpload (module here), as I'd misunderstood its claims. Yes, it handles large file uploads (from the browser's standpoint), but you still need to configure PHP's upload limits accordingly on the server. Not suitable for what I'm looking for, but it is a really nice way to handle multiple uploads. WARNING: I also learned from experience and much head-scratching that it doesn't work if you have Apache authentication on your server, presumably because the Flash runtime makes its own HTTP requests and doesn't pass along the browser's credentials...
Sorry if you came to this post looking for answers; I've simply posed more questions. But I'm hoping that someone with more experience with this issue might be able to comment, and we'll all benefit from it. Additionally, this might turn out to be a handy addition to the Media suite, perhaps as a fancy stream wrapper for handling large files? (There's a rough sketch of that idea below.) And I'll definitely follow up when I figure out how best to tackle this.
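To make the stream wrapper idea slightly less hand-wavy, here's a bare-bones sketch of what one might look like in PHP. To be clear, everything in it is hypothetical: the largefile:// scheme, the class name, and the target directory are made up for illustration, and a real wrapper would need the rest of the streamWrapper interface (seeking, stat, unlink, and so on):

    <?php
    // Hypothetical sketch of a wrapper for shuttling large files around
    // in fixed-size chunks, so a whole file never has to fit in memory.
    class LargeFileStreamWrapper {
      private $handle;

      // Map largefile://name to a real path and open it.
      public function stream_open($uri, $mode, $options, &$opened_path) {
        $name = basename(substr($uri, strlen('largefile://')));
        $this->handle = fopen('/var/largefiles/' . $name, $mode);
        return $this->handle !== FALSE;
      }

      public function stream_read($count) {
        return fread($this->handle, $count);
      }

      public function stream_write($data) {
        return fwrite($this->handle, $data);
      }

      public function stream_eof() {
        return feof($this->handle);
      }

      public function stream_close() {
        fclose($this->handle);
      }
    }

    stream_wrapper_register('largefile', 'LargeFileStreamWrapper');

    // Copy an uploaded file through the wrapper without buffering the
    // whole thing in RAM.
    $in  = fopen($_FILES['upload']['tmp_name'], 'rb');
    $out = fopen('largefile://video.mov', 'wb');
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);

Whether that's the right shape for the Media suite, I honestly don't know; consider it a conversation starter rather than a design.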