Feature #44210
Add more effective ways to work with large index
Status: Closed
Description
- large index with 3.9 GB in index_fulltext and 3.1 GB in index_rel
- several websites and languages, and therefore quite large
- trying to reduce the index by excluding certain branches from indexing (and deleting that data)
Clearing data from the index by the regular means (using the Info module on a subtree) takes "ages", since a large number of individual queries is run, and the index still remains huge.
Updated by Oliver Hader over 11 years ago
- Project changed from 1382 to TYPO3 Core
Updated by Mathias Schreiber almost 10 years ago
- Status changed from New to Needs Feedback
- Assignee set to Mathias Schreiber
Any pointers on what to do here?
Large datasets tend to get slow. That's normal.
Updated by Stefan Neufeind almost 10 years ago
Maybe some kind of cleanup task, e.g. a scheduler or CLI job? You would be able to define certain excludes, and the job would determine which parts can be mass-removed from the index tables. Doing that in a CLI job would be much better, because we don't risk timeouts, and it saves the manual searching and clicking otherwise needed to clean up certain parts.
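The mass-removal idea above could be sketched roughly like this. This is a minimal illustration only, written against SQLite rather than TYPO3's DBAL, and it assumes the indexed_search table layout (index_phash carries data_page_id per indexed entry; index_fulltext and index_rel join on phash). The function name and structure are hypothetical, not an existing TYPO3 API:

```python
import sqlite3


def purge_pages(conn, page_ids):
    """Bulk-remove all index rows belonging to the given page uids.

    Instead of one query per record, this issues one SELECT to collect
    the affected phash keys and then one DELETE per index table.
    Returns the number of index entries removed.
    """
    if not page_ids:
        return 0
    page_marks = ",".join("?" * len(page_ids))
    # Collect the phash keys of all index entries on the excluded pages.
    phashes = [row[0] for row in conn.execute(
        f"SELECT phash FROM index_phash WHERE data_page_id IN ({page_marks})",
        page_ids,
    )]
    if not phashes:
        return 0
    phash_marks = ",".join("?" * len(phashes))
    # One bulk DELETE per table instead of per-record queries.
    for table in ("index_fulltext", "index_rel", "index_phash"):
        conn.execute(f"DELETE FROM {table} WHERE phash IN ({phash_marks})", phashes)
    conn.commit()
    return len(phashes)
```

A real cleanup job would additionally have to resolve the excluded subtrees to page uids and chunk the phash list to stay under the database's placeholder limit, but the core point stands: a handful of set-based deletes scales far better than thousands of single-row queries.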
Updated by Alexander Opitz over 9 years ago
- Status changed from Needs Feedback to New
Updated by Susanne Moog about 6 years ago
- Status changed from New to Rejected
As there was no interest in picking up this topic in the last few years, and there are existing extension solutions for handling bigger search needs (such as Elasticsearch or Solr), this issue is closed for now.
Updated by Benni Mack over 4 years ago
- Sprint Focus changed from PRC to Needs Decision