Hi
Is there a setting that we've got wrong that is causing this?
No, that's simply a function of how the site is set up: how many pages it has, how often those pages change, and how much traffic it gets (which directly influences how quickly 4SEO works through the backlog of pages to analyze).
There are 2 ways this can be acted upon:
1 - Make 4SEO analyze pages more often: if you previously slowed down 4SEO's analysis, let it run faster again.
How often 4SEO analyzes pages is controlled under Configuration | System | Background processing | Pages between background processing.
That number is 1 by default, meaning that for each visit on the frontend, one page will be analyzed.
If for some reason you increased that number, say to 10, it would mean 4SEO would only analyze one page every 10 frontend visits.
Lower that number, or set it back to 1, to increase the analysis speed.
2 - Make 4SEO analyze fewer pages
Likely the best option. You can exclude groups of pages from the analysis entirely under Pages | Settings | Excluded URLs.
There you should exclude all pages that have no SEO value. What to exclude depends entirely on your site's content, so you need to review it to decide.
Exclusion is done with expressions such as /users/{*} for instance. User profiles, content edit pages, and page variations based on price, color, display or sorting variables are all good candidates for exclusion.
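For illustration only (the paths are made up, your own list must match your site's actual URL structure), an exclusion list could look like:

/users/{*}
/component/users/{*}
/shop/sort-by-price/{*}

Each line is one expression, with {*} acting as a wildcard for that part of the URL, as in the /users/{*} example above.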
You can also exclude pages using robots.txt (the 4SEO crawler complies with robots.txt rules).
This would have the added benefit of also excluding these pages from Google, which would help focus Google's attention on your actual valuable content.
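As a minimal sketch, assuming you wanted to keep crawlers (4SEO's and Google's alike) out of user profile pages, your robots.txt could contain something like:

User-agent: *
Disallow: /users/
Disallow: /component/users/

The paths here are hypothetical examples. Keep in mind that robots.txt applies to all compliant crawlers, so only list paths you genuinely do not want indexed anywhere.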
Best regards
Yannick Gaultier
weeblr.com / @weeblr