It depends on what you want. Each file belongs to a single user; would it be bad if one user accessed another user's files? If someone with access to a file passed the link along to an unauthorized person, would that be a problem? And so on.
Some comments on your proposals:
- Use robots.txt to avoid indexing by search engines.
That will only stop crawlers that respect robots.txt from indexing your files; it has no effect on those that do not respect it (if any), nor does it prevent any particular user from accessing those files. Not that it's bad to do, just insufficient... (a minimal example follows).
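For reference, a minimal robots.txt served from the site root, asking well-behaved crawlers to skip the uploads folder (the path is an assumption based on your example):

```
# https://dominio.com/robots.txt
User-agent: *
Disallow: /uploads/
```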
Likewise, preventing your webserver from returning directory indexes (e.g.: when accessing dominio.com/uploads/ it lists all the files in the uploads folder) by disabling the Indexes option also helps make it harder for a visitor to find out which files are there, but again it doesn't stop someone who already has a link from downloading the file (see the sketch below).
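A minimal sketch, assuming Apache with AllowOverride enabled for that directory:

```
# uploads/.htaccess - suppress the automatic file listing
Options -Indexes
```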
- Give complex filenames to avoid downloading by mistake.
This may be an appropriate technique, depending on your security requirements (see the start of this answer). One way to implement it is to create a link containing a UUID and/or a hash of the file, and allow anyone who has the link to download the file. Of course, your PHP will only hand the link to logged-in users, and if one of them passes/publishes the link it's not much different from simply copying the file and handing it to third parties... (a sketch follows below).
(The biggest problem with this technique is "public relations" - we know that "guessing" a UUID is totally impractical, but people without technical knowledge tend to think "ah, but what if someone discovers the link? Isn't it safer to protect it with a password?"...)
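A minimal sketch in PHP of storing an upload under an unguessable name - the function name and directory are hypothetical, and random_bytes assumes PHP 7+:

```php
<?php
// Hypothetical helper: move an upload to an unguessable name and
// return the token that goes into the download link.
function storeUpload(string $tmpPath, string $uploadsDir): string
{
    // 16 random bytes have the same entropy as a UUIDv4:
    // guessing the resulting name is totally impractical.
    $token = bin2hex(random_bytes(16));

    if (!move_uploaded_file($tmpPath, $uploadsDir . '/' . $token)) {
        throw new RuntimeException('Failed to store upload');
    }
    return $token; // persist (token, original name, owner) in your DB
}
```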
- Protect the folder with .htpasswd and .htaccess (I don't know the impact on PHP's access to the folder).
I have no experience with PHP, but I believe there is no impact on its access to the folder (this is more a matter of Apache's Directory configuration). As for effectiveness, assuming you are using HTTPS and your webserver is configured correctly (not exposing .ht* files to users - which is the default in Apache), it is a relatively safe approach. The only problems, according to that answer on security.SE, are usability (it's not exactly the most user-friendly method) and the lack of stronger protection against brute-force attacks. A minimal configuration is sketched below.
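A minimal sketch, assuming Apache and a password file kept outside the web root (the paths are assumptions):

```
# uploads/.htaccess - require HTTP Basic authentication
AuthType Basic
AuthName "Restricted files"
AuthUserFile /path/outside/public_html/.htpasswd
Require valid-user
```

The password file itself can be created with Apache's htpasswd utility, e.g. htpasswd -c /path/outside/public_html/.htpasswd username.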
- Modify the system and replace the direct link with a download button, so the directory is not exposed.
This is the most "guaranteed" way, although more tedious to implement and possibly the one with the worst performance (not necessarily something unacceptable). It would be necessary to keep the uploads folder inaccessible (the means doesn't matter: put it outside public_html, use .htaccess, use mod_rewrite, etc.) and create a downloads.php script, which could combine an access check such as the one suggested in Lollipop's answer with the use of header and readfile (update: the performance of this second part can be greatly improved through the use of mod_xsendfile; more details here).
Once that is done, you have full control over the form of authentication, the applicable throttles, etc. This is most likely the safest way, and it provides the best user experience. Should any of the above not be "good enough" for your specific requirements, this is the approach I would suggest; a sketch follows.
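A minimal sketch of such a downloads.php, assuming the files live outside the web root; the table and column names below (files, stored_name, owner_id, etc.) are hypothetical placeholders for your own schema:

```php
<?php
// downloads.php?id=123 - hypothetical sketch; adapt to your app.
session_start();

if (!isset($_SESSION['user_id'])) {
    http_response_code(403);
    exit('Not logged in');
}

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'db_user', 'db_pass');
$stmt = $pdo->prepare(
    'SELECT stored_name, original_name FROM files WHERE id = ? AND owner_id = ?'
);
$stmt->execute([(int)($_GET['id'] ?? 0), $_SESSION['user_id']]);
$file = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$file) {
    http_response_code(404);
    exit('File not found');
}

// Folder outside public_html, so the webserver never serves it directly.
$path = '/var/app/uploads/' . $file['stored_name'];

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file['original_name'] . '"');
header('Content-Length: ' . filesize($path));
readfile($path);

// With mod_xsendfile installed, the readfile() call could be replaced by
// header('X-Sendfile: ' . $path); letting Apache stream the file itself.
```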
Leave the folder with sensitive files out of public access (public_html, www, htdocs) and make a PHP script that fetches these files and makes them available for download.
– Adir Kuhn
@Adirkuhn but isn't PHP disallowed from accessing folders outside the web root? If I change the permissions, won't I be creating an even bigger problem?
– Filipe Moraes
I think your last idea is the best one, and sufficient.
– Franchesco
Usually you do have access to the level above public_html; if you don't, you will have to use another solution.
– Adir Kuhn
Using .htaccess is a way out. PHP will continue to access the folder, but Apache will not allow direct access through the browser. You will still need to create a PHP script to serve the download.
– gmsantos
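For reference, a minimal sketch of the .htaccess gmsantos describes, assuming Apache 2.4 syntax (on Apache 2.2 the equivalent would be "Deny from all"):

```
# uploads/.htaccess - block all direct browser access;
# PHP still reads the files through the filesystem.
Require all denied
```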