Bug #89386 (closed)

Backend module "FileList" not fully compatible with protected (not publicly accessible) directories

Added by Jan Kornblum over 4 years ago. Updated about 4 years ago.

Status: Rejected
Priority: Should have
Assignee: -
Category: Backend User Interface
Target version: -
Start date: 2019-10-09
Due date:
% Done: 0%
Estimated time:
TYPO3 Version: 9
PHP Version: 7.2
Tags:
Complexity:
Is Regression:
Sprint Focus:

Description

Using the backend module "FileList", it is not possible to view a file that is located inside a protected (not publicly accessible, e.g. protected by .htaccess) directory. To reproduce:

  • Place any file (e.g. a PDF) inside any folder (e.g. fileadmin/test/).
  • Put a .htaccess file containing "Deny from all" into the same folder.
  • Go to "FileList" and navigate to this folder. Everything works fine up to this point (previews are rendered etc.).
  • Click on the PDF icon -> Info, and inside the popup click the "Show" button. This leads to a "403 Access Denied" error.

It would be much better to handle this "Show" action via a PHP method (read the file contents, send the appropriate headers and echo the file content) instead of directly calling the public URL of the file.
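A minimal sketch of what such a pass-through could look like in plain PHP. The script name, the "file" query parameter and the base directory are illustrative assumptions, not the actual FileList implementation, and the required backend permission check is only marked as a TODO:

    <?php
    // Hypothetical delivery endpoint, e.g. deliver.php?file=test/invoice.pdf
    // Naive variant of the proposed approach: read the file, send headers, echo.

    $baseDir = realpath(__DIR__ . '/fileadmin');
    $requested = realpath($baseDir . '/' . ($_GET['file'] ?? ''));

    // Reject path traversal attempts and missing files.
    if ($requested === false || strpos($requested, $baseDir . DIRECTORY_SEPARATOR) !== 0) {
        http_response_code(404);
        exit;
    }

    // TODO: verify the backend user's access rights here before delivering.

    header('Content-Type: ' . (mime_content_type($requested) ?: 'application/octet-stream'));
    header('Content-Length: ' . filesize($requested));
    header('Content-Disposition: inline; filename="' . basename($requested) . '"');

    // Note: this reads the whole file into PHP's memory at once, which is
    // exactly the concern raised in the comments below.
    echo file_get_contents($requested);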

#1

Updated by Georg Ringer over 4 years ago

  • Status changed from New to Needs Feedback

Thanks for creating the issue; however, it is not that easy.

Imagine files of 10 MB, 200 MB or maybe 1 GB: reading such a file with PHP means this feature would not be so simple to implement.

#2

Updated by Jan Kornblum over 4 years ago

Georg Ringer wrote:

Thanks for creating the issue; however, it is not that easy.

Imagine files of 10 MB, 200 MB or maybe 1 GB: reading such a file with PHP means this feature would not be so simple to implement.

Hmm... is this really such a (performance-related?) problem? The BE module already renders previews automatically, e.g. for PDFs (so a "big" file is already read by PHP). Shouldn't simply reading and echoing a huge file be even easier?

If it's really not possible: what about implementing a "switch" that still delivers files bigger than 500 MB directly, but serves smaller files using my approach?

I think this feature would be really useful. I've got several projects where the application creates e.g. invoices or similar documents (which must not be publicly accessible). In these cases an editor should still be able to access these files using the backend module.

#3

Updated by Hannes Strangmeier over 4 years ago

Some notes regarding performance; correct me where I am wrong:

1) Rendering thumbnails for PDFs is done by ImageMagick / GraphicsMagick, which are specifically designed for this job. It also happens only once (unless you delete the created thumbnail). PHP is only involved insofar as it tells IM/GM to do the job.

2) Delivering files through PHP needs a lot of RAM, since the file itself is loaded into PHP's memory. Say you have a 500 MB file, read it in PHP and then deliver it: this will need approximately 540 MB of memory (in a short test, memory_get_usage() for the file list reported around 36 MB). So a memory_limit of 512 MB would not even be enough to handle this request.

3) Webservers do not need this amount of memory, because they read the file from storage with kernel functions and hand it more or less directly to the kernel's network buffers (a highly simplified explanation!). So delivering files through the webserver itself saves a huge amount of resources, especially memory.

4) Implementing basic webserver features such as HTTP 206 responses, resuming partially completed downloads etc. is a pain and might lead to even more memory usage (among other things because you have to read parts of the files).

If you have to deliver files through PHP, I would highly recommend checking the individual needs of your use case in combination with the server you are using. Keep in mind that simultaneous users accessing those files each need their own memory: delivering a 500 MB file to 4 simultaneous users will need around 2 GB of your system's RAM. This memory also stays blocked for the whole time the user takes to download the file, since the PHP process has to keep running during that time. That being said, your max_execution_time also has to be high enough to deliver the files to the user (which might take a long time if the user does not have much bandwidth).
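If delivery through PHP is unavoidable, one common mitigation is to stream the file in fixed-size chunks so peak memory stays near the chunk size rather than the file size. A sketch for illustration (the function name and chunk size are assumptions); note that this still occupies a PHP process for the full duration of the download and does not handle HTTP 206 / resumable downloads:

    <?php
    // Stream a file in small chunks instead of reading it into memory at once.
    // Peak memory stays near the chunk size regardless of file size, but the
    // PHP process is still blocked until the download finishes, and range
    // requests (HTTP 206) are not supported here.
    function streamFile(string $absolutePath, int $chunkSize = 8192): void
    {
        header('Content-Type: application/octet-stream');
        header('Content-Length: ' . filesize($absolutePath));

        $handle = fopen($absolutePath, 'rb');
        if ($handle === false) {
            http_response_code(500);
            return;
        }
        while (!feof($handle)) {
            echo fread($handle, $chunkSize);
            flush(); // hand each chunk over to the webserver immediately
        }
        fclose($handle);
    }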

I would not recommend implementing such behaviour as a default solution.

#4

Updated by Georg Ringer about 4 years ago

  • Status changed from Needs Feedback to Rejected

I reject the issue.
