

#11971 – 4SEO Analysis and Sitemap Issues in Local Setup

Posted in ‘4SEO’
Wednesday, 27 August 2025 06:26 UTC
marketing-siam-it-com

Good morning,

I can't get the pages to be analysed. When I click "Analyse", the number of pending pages increases until it reaches a limit, then it gradually drops back to zero. However, both the "analysed" and "errors" sections remain at zero the entire time. TLS verification is disabled because I'm working locally.

I did manage to get broken links detected at one point, but now when I run the analysis again, everything resets to zero.

In the sitemap section, it does show the total number of images, but the URLs are left empty. When I click the sitemap link, although it says there are 359 images, the XML shows everything as zero, and no XML file is created in the specified path.

What am I missing in the configuration?

Many thanks,
Best regards

Wednesday, 27 August 2025 07:51 UTC
wb_weeblr

Hi

What am I missing in the configuration?

Well, what "local setup" are you using, exactly?

4SEO does not really care about where your site is hosted, as long as the Website home address setting matches the address the site is actually served from (i.e. localhost, something.dev, ...).

When I click "Analyse", the number of pending pages increases until it reaches a limit, then it gradually drops back to zero.

Exactly as expected, that's the normal analysis process when you trigger it manually with Analyze Now.

However, both the "analysed" and "errors" sections remain at zero the entire time.

This means that 4SEO analyzed your home page, found links, analyzed those pages, and found that they don't pass the rules you set for inclusion. For instance, they have noindex robots tags, or are excluded by robots.txt, etc.
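To illustrate the kind of checks involved (a hedged sketch in plain PHP, not 4SEO's actual code; the URL is a placeholder from this ticket):

<?php
// Sketch only: two of the signals that can get a page rejected during
// analysis - a robots meta tag in the HTML, or an X-Robots-Tag header.
$url = 'https://nombre.empresa.local/some-page'; // hypothetical page

$context = stream_context_create([
    'ssl' => ['verify_peer' => false, 'verify_peer_name' => false], // local self-signed cert
]);
$html = file_get_contents($url, false, $context);

// <meta name="robots" content="... noindex ..."> in the page body
$metaNoindex = (bool) preg_match(
    '/<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex/i',
    $html
);

// X-Robots-Tag: noindex in the response headers
// ($http_response_header is populated by file_get_contents)
$headerNoindex = (bool) preg_grep('/^X-Robots-Tag:.*noindex/i', $http_response_header);

echo ($metaNoindex || $headerNoindex)
    ? "Page asks not to be indexed and would be rejected.\n"
    : "No noindex signal; robots.txt rules would be checked next.\n";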

When I click the sitemap link, although it says there are 359 images, the XML shows everything as zero

What does "everything as zero" mean, exactly? Where do you see zeros?

and no XML file is created in the specified path

There will never be any actual file created. 4SEO does not write a .xml file for the sitemap; the sitemap is served dynamically at the sitemap link.

Best regards

Yannick Gaultier

weeblr.com / @weeblr

Wednesday, 27 August 2025 13:36 UTC
marketing-siam-it-com

Hello,

The site is set up on a local server and we access it via DNS — in this case, it's https://nombre.empresa.local/, which is what we've configured as the site's main address.

"Exactly as expected, that's the normal analysis process when you trigger it manually with Analyse Now." → I'm not sure I understand this part. Does that mean it's normal for no pages to be analysed? The website is available in 12 languages, it's quite large and has many pages indexed in Google.

Regarding the sitemap, I'm referring to the two images attached.

Many thanks,
Best regards


Wednesday, 27 August 2025 13:50 UTC
wb_weeblr

Hi

in this case, it's https://nombre.empresa.local/, which is what we've configured as the site's main address.

Is that a real HTTPS certificate, or self-signed? If self-signed, and you did not disable the certificate check in 4SEO's site analysis settings, then the PHP crawler will refuse to connect.
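If you want to check outside of 4SEO whether this is what is happening, here is a minimal sketch (assuming the PHP curl extension and the placeholder URL from this ticket):

<?php
// First attempt is strict and fails on a self-signed certificate; the
// second relaxes verification, which is what the "disable certificate
// check" option effectively does for 4SEO's own requests.
$url = 'https://nombre.empresa.local/';

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_SSL_VERIFYPEER => true,  // strict: rejects self-signed certs
    CURLOPT_SSL_VERIFYHOST => 2,
]);
curl_exec($ch);
echo 'Strict TLS: ' . (curl_errno($ch) ? curl_error($ch) : 'OK') . "\n";

curl_setopt_array($ch, [
    CURLOPT_SSL_VERIFYPEER => false, // relaxed: accepts self-signed certs
    CURLOPT_SSL_VERIFYHOST => 0,
]);
curl_exec($ch);
echo 'Relaxed TLS: ' . (curl_errno($ch) ? curl_error($ch) : 'OK') . "\n";

curl_close($ch);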

Does that mean it's normal for no pages to be analysed?

Absolutely all pages are analyzed (unless you used a self-signed certificate). But the result of the analysis is that no page is kept as valid in the pages list. That's why the sitemap lists no page. You will have pages listed in the sitemap when you have pages on the Pages page in 4SEO. The sitemap is essentially a copy of what's in the Pages page.

So they were all analyzed. And likely rejected. Which is why I wrote:

This means that 4SEO analyzed your home page, found links, analyzed those pages, and found that they don't pass the rules you set for inclusion. For instance, they have noindex robots tags, or are excluded by robots.txt, etc.

Is that the case?

Best regards

Yannick Gaultier

weeblr.com / @weeblr

Thursday, 28 August 2025 05:55 UTC
marketing-siam-it-com

Hi,

Thank you very much for your help. Honestly, I’m not sure why the pages aren’t being recognised. The certificate is self-signed, but I’ve ticked the option shown in the screenshot I’ve attached.

I also don’t think it’s the case you mentioned — I’ve attached an image showing what’s excluded in the robots.txt, and most pages don’t have a noindex tag.

I think I’ll try running the analysis directly on the remote server to rule out whether the issue is related to being in a local environment.

When we upload the site to the remote server, we have a client area with a login. After logging in, users are logged out and see a message saying the cache has expired. If I disable the 4SEO plugins, it works again. What configuration should I change to prevent this from happening?

Many thanks,
Best regards

Thursday, 28 August 2025 08:16 UTC
wb_weeblr

Hi

The certificate is self-signed, but I’ve ticked the option shown in the screenshot I’ve attached.

That's still a red flag. Have you checked the logs in the Joomla log folder (you can use our small 4Logs plugin for that)? There may be some information there.

Another issue I sometimes see with custom setups like yours is DNS. I'm not sure how you set up DNS to use that domain locally, but the analysis process is your own website (i.e. PHP) making HTTP requests to your site. So if PHP cannot resolve that name, the requests will fail.
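A one-line way to test that from PHP itself (a sketch; gethostbyname() returns its input unchanged when resolution fails):

<?php
// Can PHP - the process that runs the crawler - resolve the local domain?
$host = 'nombre.empresa.local';
$ip   = gethostbyname($host);

echo $ip === $host
    ? "PHP cannot resolve {$host}; the crawler's requests would fail.\n"
    : "{$host} resolves to {$ip} for PHP.\n";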

That's another possible reason, but your local setup is likely to be the root cause here. It is of course normal to use 4SEO on a local setup; I personally only develop locally (on Windows, even!).

If I disable the 4SEO plugins, it works again.

Which absolutely does not mean 4SEO is actually causing the issue.

What configuration should I change to prevent this from happening?

There's nothing in 4SEO itself that can cause anything like this. So what you have to look at is all the configuration you made on top of it:

- the Website home address of course

- disable all the "Always redirect to home address" options

- disable all redirect rules (there's an option in the Settings button in the toolbar to disable all rules with one click)

- same for any other SEO rules and replacement rules

Best regards

Yannick Gaultier

weeblr.com / @weeblr

Tuesday, 02 September 2025 05:18 UTC
marketing-siam-it-com

Good morning,

I’ve finally managed to get the pages to be analysed. I disabled the CDN and external cache options. The thing is, we’re not actually using any CDN on the site, so I’m not sure if the DNS setup is causing it to be detected that way.

The analysis hasn’t finished yet, but at least it’s now running.

Thank you very much for your support.

Best regards

Tuesday, 02 September 2025 09:06 UTC
wb_weeblr

Hi

I disabled the CDN and external cache options

When the "bypass CDN" option is enabled, 4SEO appends a random query variable to all URLs (e.g. ?_cdnbust=917d65da-483b-4afa-bbf2-6887b6922597) when fetching pages to be analyzed. This way, we get a fresh page from any caching or CDN mechanism that may be in place.

That's all it does. We can't tell, and don't even try to detect, whether you use a CDN or caching at the server level. The only thing 4SEO cares about is bypassing any such mechanism, so that when it fetches a page to analyze it, the page is rendered by Joomla and we do not get a cached response, whether cached by a CDN, Varnish on your server, nginx fastcgi cache, or anything else.

Then, when that request is rendered by Joomla, the first thing 4SEO does is remove that query variable from the request data.
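For illustration only, the mechanism described above boils down to something like this (a sketch, not 4SEO's actual code):

<?php
// Fetch side: append a throwaway query variable so any cache or CDN
// misses and Joomla renders the page fresh.
$url       = 'https://nombre.empresa.local/en/some-page'; // hypothetical
$separator = (strpos($url, '?') === false) ? '?' : '&';
$fetchUrl  = $url . $separator . '_cdnbust=' . bin2hex(random_bytes(16));

// Server side: strip the variable from the incoming request data so the
// rest of the application never sees it.
unset($_GET['_cdnbust'], $_REQUEST['_cdnbust']);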

Best regards

Yannick Gaultier

weeblr.com / @weeblr

Tuesday, 02 September 2025 09:20 UTC
marketing-siam-it-com

Hi Yannick,

Now I’m wondering about the response. After disabling it, the pages are being analysed — is that okay as it is? Or is there something else I should change?

Best regards,

Tuesday, 02 September 2025 09:57 UTC
wb_weeblr

Hi

Again: absolutely no idea. Your specific local setup is doing this. I don't know what it does, how Apache/nginx or whatever you use is configured, what it passes back to PHP, what it hides from PHP, etc. And even if I knew, I could not possibly give you good advice, because the only thing 4SEO is in control of is inside of Joomla. The webserver environment is not something I can control, support or advise about.

If it works now, it's all good, but I can't tell you why it works now and did not work before because 4SEO is strictly not involved here.

What 4SEO needs is that when it makes a regular HTTP request from your Joomla site to your Joomla site using Joomla's PHP HTTP client, that request goes through your webserver and into Joomla, just like any visitor's request. All request headers have to be passed through normally. If that happens, then everything works.
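A minimal sketch of such a loopback request, assuming a bootstrapped Joomla environment and the curl transport (the TLS options are only needed for a local self-signed certificate):

<?php
// Joomla's own HTTP client fetching the site's home page, like any
// visitor would. If this fails, background analysis cannot work either.
use Joomla\CMS\Http\HttpFactory;
use Joomla\CMS\Uri\Uri;
use Joomla\Registry\Registry;

$options = new Registry;
$options->set('transport.curl', [
    CURLOPT_SSL_VERIFYPEER => false, // local self-signed cert only
    CURLOPT_SSL_VERIFYHOST => 0,
]);

$http     = HttpFactory::getHttp($options);
$response = $http->get(Uri::root()); // e.g. https://nombre.empresa.local/

echo $response->code === 200
    ? "Loopback request works; the request reached Joomla normally.\n"
    : "Loopback request returned HTTP {$response->code}.\n";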

Best regards

Yannick Gaultier

weeblr.com / @weeblr

Tuesday, 02 September 2025 12:07 UTC
marketing-siam-it-com

Hello,

OK, thank you so much for your help. It seems we’re making progress step by step. I’ve now managed to get all the pages analysed — over 600 of them. But currently, none of the URLs are being recognised in the sitemap.

For now, I haven’t created any rules, as I want to make sure everything is being picked up correctly first. In the page list, the sitemap column shows “auto” for all entries, with an “X” next to it — I assume that means they’re excluded by default.

Where can I change this setting so that URLs are included in the sitemap by default, unless I specify otherwise?

Sorry for all the questions — I didn’t expect it would be this tricky to implement. Thank you again for your time.

Best regards

Tuesday, 02 September 2025 12:15 UTC
marketing-siam-it-com

Hi,

Right, it seems I jumped ahead with my last question. It looks like the pages aren’t recognised until you actually browse the site — now it’s starting to include some of them.

You mentioned that you also work locally — how do you handle this kind of situation? Do you just wait and rebuild the sitemap once the site is live? Is there a way to tell it to build the sitemap with all pages, even if they haven’t had any visits? Or what would be the best approach in this case?

Many thanks!

Tuesday, 02 September 2025 16:26 UTC
wb_weeblr

Hi

how do you handle this kind of situation?

I don't. There has never been any sort of issue with 4SEO or any other Joomla site. You run a local webserver, Apache, MySQL, PHP, and that just works. There's nothing special about being local.

It looks like the pages aren’t recognised until you actually browse the site

Well, yes, that's normal and expected. Background processing in Joomla is triggered by frontend visitors. No visitors, no work can be done.

That's why there's a manual analysis. You should read the Getting started overview about pages.
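If you want to stand in for frontend visitors on a local site, a rough sketch (URL, count and pacing are assumptions, adjust for your site):

<?php
// Request front-end pages in a loop; each hit gives Joomla's background
// processing a chance to run, just as a real visitor would.
$home = 'https://nombre.empresa.local/';

for ($i = 0; $i < 50; $i++) {
    $ch = curl_init($home);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_SSL_VERIFYPEER => false, // local self-signed cert
        CURLOPT_SSL_VERIFYHOST => 0,
    ]);
    curl_exec($ch);
    curl_close($ch);
    sleep(1); // let each request finish its background slice
}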

Best regards

Yannick Gaultier

weeblr.com / @weeblr

Friday, 03 October 2025 05:34 UTC
system
This ticket has been automatically closed. All tickets which have been inactive for a long time are automatically closed. If you believe that this ticket was closed in error, please contact us.
This ticket is closed, therefore read-only. You can no longer reply to it. If you need to provide more information, please open a new ticket and mention this ticket's number.