Collection of problems with large sites
A site can be "large" in different ways: many pages, many files, many directories, many categories, etc., or a combination of these.
The problems are mostly related to:
- performance and resource (memory) problems.
- exceptions, e.g. the database query error “Prepared statement contains too many placeholders”
- usability: e.g. long, unusable lists (see sys_filemount: long list of directories plus a performance issue; unable to filter for specific pages in the page tree filter, etc.)
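The “too many placeholders” exception comes from a hard limit: MySQL/MariaDB cap a prepared statement at 65535 placeholders, because the parameter count is a 16-bit field. The usual fix is to split huge IN () lists into chunks. A minimal sketch in Python (TYPO3 itself would do this in PHP via Doctrine DBAL; the helper below is illustrative only, not core code):

```python
# Sketch: avoid "Prepared statement contains too many placeholders" by
# splitting a huge IN () list into safe chunks.

MAX_PLACEHOLDERS = 65535  # MySQL/MariaDB hard limit (16-bit parameter count)

def chunked_in_queries(table, column, ids, chunk_size=1000):
    """Yield (sql, params) pairs, each with at most chunk_size placeholders."""
    assert 0 < chunk_size <= MAX_PLACEHOLDERS
    for start in range(0, len(ids), chunk_size):
        chunk = ids[start:start + chunk_size]
        placeholders = ", ".join(["?"] * len(chunk))
        sql = f"SELECT * FROM {table} WHERE {column} IN ({placeholders})"
        yield sql, chunk

# 250 000 uids in a single IN () would exceed the limit; chunking keeps
# every statement well under it.
queries = list(chunked_in_queries("pages", "uid", list(range(250_000))))
print(len(queries))  # → 250
```

The trade-off is several round trips instead of one statement, but each one stays far below the server limit regardless of how large the site grows.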
Can the question of large sites be addressed in general?
- consider large sites when creating concepts for new features
- create a reference "large site"
- test patches on “large sites”
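One way to make “test patches on large sites” concrete is a small seeding script that generates a large page tree as SQL. A hypothetical sketch (the table and column names mirror TYPO3’s pages table, but this is an illustration of the idea, not an official fixture generator):

```python
# Sketch: generate INSERT statements for a large page tree so patches can be
# tested against realistic data volumes. Hypothetical helper, not an official
# TYPO3 fixture tool; a real script would also fill required columns.

def page_tree_sql(num_pages, children_per_page=50):
    """Yield one INSERT per page; every parent gets `children_per_page`
    children, with uid 0 acting as the tree root."""
    for uid in range(1, num_pages + 1):
        pid = (uid - 1) // children_per_page  # parent uid, 0 = root
        yield (f"INSERT INTO pages (uid, pid, title) "
               f"VALUES ({uid}, {pid}, 'Page {uid}');")

statements = list(page_tree_sql(100_000))
print(len(statements))  # → 100000
```

Piping the output into the database of a scratch installation gives a reproducible 100 000-page reference tree; the same approach extends to backend users, groups, and redirects.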
A very important change in the heads of core devs, or in the "mentality" going forward, must be that new features are not developed simply "parallel" to the core, but so that they respect the already existing architecture. For example, the redirects module that came with v9 totally ignored user permissions: if you are allowed to see the module, you simply see all redirects from all domains, not just the ones you are allowed to see. That is not a simple "bug"; permission handling was just not implemented at all.
Or #88672, which just arbitrarily assumed (with a hardcoded magic number) that there is a maximum of 100 pages with the same name.
Or #91995, which just assumed that an unconfigured domain (in sys_domain or the site YAML) can never (accidentally!) point to a TYPO3 system, leading to arbitrary output.
This shows that many new features are implemented from a single point of view: single/few-domain setups with mostly admin BE users. This mentality automatically also leads to the problems in this epic: performance problems.
Updated by Sybille Peters 8 months ago
@Stefan P I don't think it is intentional. Some new features / changes are already quite complex in themselves, and considering user permissions, languages, workspaces etc. makes them even more complex. Really thinking everything through for multiple users, multiple sites, and large sites is not an easy task.
But I agree with you that these things are not always considered, which is the reason I started this issue.
I think finding a method for testing with large sites (also with multiple users, multiple domains etc.) would at least help detect potential problems early.
What I also find missing is a definition of the limits. I realize this is not easy to do; in some cases it also depends on the memory allocated and on other factors. But it would help to have a reference site and be able to say: TYPO3 11 was tested with this reference site, with 100 000 pages, 2000 backend users, 300 backend groups, 5 sites, 2000 redirects etc. I don't know if this is realistic; it might require some looking into.
And maybe additionally post some numbers, where problems may be expected, e.g. are there currently any hard limits?
Finding best practices for how to solve these things in general in the code might also help.
From my practical experience with a "large site": not everything is a huge problem, and it depends. But I did expect TYPO3 to work well with large sites, and as an enterprise solution I think it should. Most of the problems are just a nuisance or do not affect us, but some have been a big problem and have made time- and cost-intensive workarounds necessary.