I work with Symfony 2.5, Heroku, and Amazon S3 for hosting static files. In my current project, users can upload very large images, and this ends up locking up my app's server or timing out, because the server cannot finish transferring the images.
I think the best solution would be to transfer the file directly to S3, but I have never done this and I don't want to take a chance without thinking it through first.
Any suggestions?
Thank you!
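A direct-to-S3 upload usually works by having the app server sign a short-lived upload policy, while the browser posts the file straight to the bucket, so the large body never passes through the app. A minimal sketch of the policy-signing step (the legacy S3 browser-POST scheme, Signature V2, which matches the Symfony 2.5 era; the function name and values are hypothetical):

```python
import base64
import hashlib
import hmac
import json
from datetime import datetime, timedelta, timezone

def s3_post_policy(secret_key, bucket, key_prefix, max_bytes, minutes=15):
    """Build the base64 policy document and HMAC-SHA1 signature that a
    browser form needs for a direct POST upload to S3 (legacy V2 scheme)."""
    expiration = (datetime.now(timezone.utc) + timedelta(minutes=minutes)
                  ).strftime("%Y-%m-%dT%H:%M:%SZ")
    policy = {
        "expiration": expiration,
        "conditions": [
            {"bucket": bucket},                       # uploads restricted to this bucket
            ["starts-with", "$key", key_prefix],      # and to this key prefix
            ["content-length-range", 0, max_bytes],   # cap the object size
        ],
    }
    policy_b64 = base64.b64encode(json.dumps(policy).encode("utf-8")).decode("ascii")
    signature = base64.b64encode(
        hmac.new(secret_key.encode("utf-8"),
                 policy_b64.encode("ascii"),
                 hashlib.sha1).digest()
    ).decode("ascii")
    return policy_b64, signature
```

The browser form then posts the file to the bucket's endpoint along with the `policy`, `signature`, and `AWSAccessKeyId` fields; the app server only ever signs small policy documents, never the image itself.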
I also use Gaufrette. The problem is that I would no longer perform the upload with PHP, only with JavaScript, and that seems very crash-prone to me, because I would have to upload the file to S3 and then update my app, not to mention that I would have to keep signing every request made to Amazon.
– Fábio Lemos Elizandro
You could use a public bucket for the uploads. Also, if your bundle is well configured, requests to S3 from your application are quite simple.
– Rodrigo Rigotti
But in a public bucket, can't anyone keep uploading files?
– Fábio Lemos Elizandro
Yes, read here: http://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html However, I still think it's better to find an upload solution within your application (one that puts the file in the bucket) than to allow public access.
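To illustrate the distinction being made here: a bucket policy can grant the public read access to objects without letting anyone write to the bucket. A sketch of such a policy (bucket name hypothetical), in the format shown in the linked AWS documentation:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadOnly",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
```

Because the policy grants only `s3:GetObject`, uploads (`s3:PutObject`) still require credentials or a signed request from the application.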
– Rodrigo Rigotti
Yes, I think I will increase the timeout of my PHP script for these requests to avoid these complications. Thank you @Rodrigo-Rigotti
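For reference, the PHP-side limits involved would be raised in `php.ini` (values here are illustrative):

```ini
; php.ini — raise PHP's own limits for large uploads
max_execution_time = 300
upload_max_filesize = 100M
post_max_size = 100M
memory_limit = 256M
```

Note that on Heroku this may not be enough on its own: Heroku's router enforces its own request timeout (30 seconds) regardless of PHP settings, which is one reason direct-to-S3 uploads are commonly recommended on that platform.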
– Fábio Lemos Elizandro
I think it's best to limit the upload size of the images, don't you think?
– Rodrigo Rigotti
Supporting large images is a requirement of the system.
– Fábio Lemos Elizandro