Protect folder from direct access

The user needs to log in to access a list of documents.

All documents are in the "/uploads" folder.

Once you know the directory, it is easy to access it: just type dominio.com/uploads into the browser.

What can I do to protect the files from direct access and allow downloads only from within the system?

I have some ideas, but I don’t know if they’ll be enough:

  • Use robots.txt to avoid indexing by search engines.
  • Give complex filenames to avoid downloading by mistake.
  • Protect the folder with .htpasswd and .htaccess (I don’t know what impact this has on PHP’s access to the folder).
  • Modify the system and replace the direct link with a download button, so the directory is not exposed.
  • Leave the folder with sensitive files out of public access (public_html, www, htdocs) and make a PHP script that fetches these files and makes them available for download.

  • @Adirkuhn But PHP is not allowed to access folders outside the document root. If I change the permissions, won’t I be creating an even bigger problem?

  • I think your last idea is the best, and sufficient.

  • Usually you do have access to a level above public_html; if you don’t, you will have to use another solution.

  • Using .htaccess is a way out. PHP will continue to access the folder, but Apache will not allow direct access through the browser. You will still need a PHP script to handle the download, though.

3 answers


It depends on what you want. Does each file belong to a single user, and would it be bad if one user accessed another user’s files? If someone with access to a file passed the link to an unauthorized person, would that be a problem? And so on.

Some comments on your proposals:

  • Use robots.txt to avoid indexing by search engines.

That will only prevent crawlers that respect robots.txt from indexing your files - it has no effect on those that do not respect it (if any), nor does it prevent any particular user from accessing those files. Not that it’s bad to do it, it’s just insufficient...
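For reference, a minimal robots.txt along these lines (assuming the folder is served at /uploads) would be:

User-agent: *
Disallow: /uploads/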

Likewise, preventing your web server from returning directory indexes (e.g., when accessing dominio.com/uploads/, returning a list of all the files in the uploads folder) - by disabling the Indexes option - also helps make it harder for a visitor to find out which files are there, but it still doesn’t stop someone who already has a link from downloading the file.
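In Apache, for example, that option can be turned off with a one-line .htaccess inside uploads (a sketch; other web servers have equivalent settings):

# Disable directory listing for this folder
Options -Indexes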

  • Give complex filenames to avoid downloading by mistake.

This may be an appropriate technique, depending on your security requirements (see the start of this answer). One way to implement it is to create a link containing a UUID and/or a hash of the file, and allow anyone who has the link to download the file. Of course, your PHP would only give the link to logged-in users, and if one of them passes on/publishes the link, that is not much different from simply copying the file and handing it to third parties...

(The biggest problem with this technique is "public relations" - we know that "guessing" a UUID is totally impractical, but people without technical knowledge tend to think "ah, but what if someone discovers the link? wouldn’t it be safer to protect it with a password?"...)
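A minimal PHP sketch of the idea - the form field name, the uploads path, and the .pdf extension are illustrative assumptions:

<?php
// Store the uploaded file under a random, unguessable name (PHP 7+).
$token = bin2hex(random_bytes(16)); // 32 hex characters; guessing one is impractical

$destino = __DIR__ . '/uploads/' . $token . '.pdf';
move_uploaded_file($_FILES['arquivo']['tmp_name'], $destino);

// Persist the token -> owner/original-name mapping somewhere (a database, say)
// and show the resulting link only to logged-in users:
echo "https://dominio.com/uploads/" . $token . ".pdf";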

  • Protect the folder with .htpasswd and .htaccess (I don’t know what impact this has on PHP’s access to the folder).

I have no experience with PHP, but I believe there is no impact on its access to the folder (this is more a matter of Apache’s Directory configuration). As for effectiveness, assuming you are using HTTPS and your web server is configured correctly (not exposing the .ht* files to users - which is the default in Apache), it is a relatively safe approach. The only problems, according to that answer on security.SE, are usability (it is not exactly the most user-friendly method) and the lack of stronger protection against brute-force attacks.
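For completeness, a sketch of such an .htaccess (the AuthUserFile path is an assumption - the .htpasswd itself should live outside the public folder):

AuthType Basic
AuthName "Restricted documents"
AuthUserFile /path/outside/public_html/.htpasswd
Require valid-user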

  • Modify the system and replace the direct link with a download button, so the directory is not exposed.

This is the most "guaranteed" way, although more tedious to implement and possibly the one with the worst performance (not necessarily something unacceptable). It would be necessary to keep the uploads folder inaccessible (the means doesn’t matter: putting it outside public_html, using .htaccess, using mod_rewrite, etc.) and to create a downloads.php script - which could combine an access check such as the one suggested in Lollipop’s answer with the use of header and readfile (update: the performance of this second part can be greatly improved through the use of mod_xsendfile; more details here).
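A minimal sketch of that downloads.php, assuming the files live outside public_html and that login sets the session flag used below (the parameter and path names are illustrative):

<?php
session_start();

// Only logged-in users may download anything.
if (empty($_SESSION['estoulogado'])) {
    header('Location: http://dominio.com/login.php');
    exit;
}

// Never trust a raw filename from the client: basename() strips "../" tricks.
$nome = basename($_GET['arquivo'] ?? '');
$caminho = '/path/outside/public_html/uploads/' . $nome;

if ($nome === '' || !is_file($caminho)) {
    http_response_code(404);
    exit;
}

// If each file belongs to a single user, also check ownership here.

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $nome . '"');
header('Content-Length: ' . filesize($caminho));
readfile($caminho);
exit;

With mod_xsendfile enabled, the final header/readfile calls could be replaced by a single X-Sendfile header, letting Apache itself stream the file.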

Once that is done, you have full control over the form of authentication, the throttling that applies, etc. This is most likely the safest way, and it provides the best user experience. If any of the options above is not "good enough" for your specific requirements, this is the one I would suggest.

  • Thank you. Yes, each file belongs to a single user, and it would be bad if one user accessed another’s files. The solutions have been implemented; I just wanted a second opinion on whether they are sufficient as minimum protection.


I have an approach that can be a viable alternative: use $_SESSION to validate access to the index.php of dominio.com/uploads. First, however, you need to do this in .htaccess, to avoid direct access to the files:

RewriteEngine On
# Send to the login page any request that is not an existing file,
# or any direct request for a .pdf:
RewriteCond %{REQUEST_FILENAME} !-f [OR]
RewriteCond %{REQUEST_FILENAME} \.pdf$
RewriteRule ^(.*)$ http://dominio.com/login.php$1 [L]

On the login screen, after authentication you would do this:

...

session_start();
$_SESSION['estoulogado'] = 1; // flag marking this session as authenticated

...

Then, in dominio.com/uploads, you would have something like this:

...

session_start();

// Anyone without the session flag is sent back to the login page.
if (empty($_SESSION['estoulogado'])) {
    header("Location: http://dominio.com/login.php");
    exit; // stop here so nothing below is ever served
} else {
    $username = $_SESSION['user'];
    $idusername = $_SESSION['iduser'];
}

...

DOWNLOAD PAGE CODE

...
  • Thank you. In fact, the authentication system is already done and the download is performed using header and readfile; I just wondered whether there were other solutions. Thank you for the reply.

  • I think he wants to prevent the listing of the files, because he probably has no index.php there, or has not blocked the Indexes option.

  • Filipe, if you are already using header and readfile, then it would be much more practical to put the files in a private directory, outside the public folder, understood? By the way, using header and readfile to serve a physical file from disk is not great for performance, but that’s not the focus of this question; we can discuss it another time. I’ll just leave the tip.


Create an index.php inside the folder, and in this file redirect to a safe location, such as your website’s home page.

<?php header("Location: ../views/home.php"); exit; ?>
