Is PageSpeed Insights wrong?


reghost
Well-Known Member · Vip Member
Reputation: 16% · Joined: 25/8/21 · Messages: 117
When I ran an analysis with PageSpeed Insights, it says:

Search engines can't include your pages in search results unless they have permission to crawl them. Learn more about crawler directives.
Blocking directive source: head > meta
<meta name="robots" content="NOINDEX, NOFOLLOW" />
You must give search crawlers access to your app in order for it to appear in search results.
But when I right-click on the page and select "Developer Tools -> View Page Source", <meta name="robots" content="NOINDEX, NOFOLLOW" /> is nowhere in the source.
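One way to check what the server actually sends (as opposed to what the browser DOM shows after JavaScript or an add-on has run) is to fetch the raw HTML yourself and scan it for a robots meta tag. Here is a minimal sketch using only Python's standard library; the sample page source below is hypothetical, and in practice you would feed in the bytes from e.g. urllib.request.urlopen(url).read():

```python
from html.parser import HTMLParser


class RobotsMetaFinder(HTMLParser):
    """Collects the content attribute of every <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "robots":
                self.directives.append(a.get("content", ""))


def robots_directives(html: str) -> list:
    """Return all robots meta directives found in the raw HTML."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return finder.directives


# Hypothetical page source for illustration
sample = '<html><head><meta name="robots" content="NOINDEX, NOFOLLOW" /></head></html>'
print(robots_directives(sample))  # -> ['NOINDEX, NOFOLLOW']
```

If a tool like PageSpeed Insights sees the tag but "View Page Source" does not, it is worth fetching the page with a crawler-like User-Agent header as well, since some add-ons serve different HTML to bots than to browsers.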

There is nothing blocking Google Search; this is what my robots.txt looks like.

Code:
You must log in to view
(81 lines)
On Google Search I get this:

Your sitemap appears to be an HTML page. Please use a supported format instead.

I have added sitemap.xml to my XML sitemap generation settings.
The page seoptimer.com says this:
We have not detected or been able to retrieve an XML sitemap file successfully.
Sitemaps are recommended to ensure that search engines can intelligently crawl all of your pages
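Google's "appears to be an HTML page" message usually means the sitemap URL is returning an HTML document (often an error or login page) instead of XML. A quick sanity check is to try parsing whatever the URL returns as XML and look for the expected sitemap root element. A sketch with hypothetical sample responses:

```python
import xml.etree.ElementTree as ET

# Valid root elements per the sitemaps.org protocol
SITEMAP_ROOTS = {"urlset", "sitemapindex"}


def looks_like_sitemap(body: str) -> bool:
    """Return True if the body parses as XML with a sitemap root element."""
    try:
        root = ET.fromstring(body)
    except ET.ParseError:
        return False  # not well-formed XML at all
    # Strip a namespace prefix such as {http://www.sitemaps.org/schemas/sitemap/0.9}
    tag = root.tag.split("}")[-1]
    return tag in SITEMAP_ROOTS


# Hypothetical responses for illustration
xml_body = ('<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            '<url><loc>https://example.com/</loc></url></urlset>')
html_body = "<!DOCTYPE html><html><body>Please log in</body></html>"

print(looks_like_sitemap(xml_body))   # -> True
print(looks_like_sitemap(html_body))  # -> False
```

If the check fails for your live sitemap URL, something between the generator and the visitor (a plugin, a redirect, a login wall) is replacing the XML with an HTML page.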

What can I do to solve this? I turned off Cloudflare and used just my web host, but the error is still the same, so it can't be a Cloudflare setting.

Are there others who have, or have had, the same problem? If so, how did you solve it?
 
I'm tracking down the error, and it seems to be an add-on that's getting in the way somewhere.
 
I get an error when validating on the page
Errors
Incorrect http header content-type: "text/html; charset=utf-8" (expected: "application/xml")
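That validator complaint is about the HTTP response header, not necessarily the file itself: the sitemap can be perfectly valid XML while the server labels it text/html. A small sketch of the check, using a hypothetical headers dict like the one you would get back from an HTTP client:

```python
# Content-Type media types commonly accepted for XML sitemaps
ACCEPTED_XML_TYPES = {"application/xml", "text/xml"}


def sitemap_content_type_ok(headers: dict) -> bool:
    """Check the Content-Type header, ignoring parameters like charset."""
    content_type = headers.get("Content-Type", "")
    media_type = content_type.split(";")[0].strip().lower()
    return media_type in ACCEPTED_XML_TYPES


# The header reported by the validator in this thread:
print(sitemap_content_type_ok({"Content-Type": "text/html; charset=utf-8"}))  # -> False
# What the validator expects:
print(sitemap_content_type_ok({"Content-Type": "application/xml"}))  # -> True
```

A text/html Content-Type on a sitemap URL typically points at whatever layer builds the response (the app, an add-on, or a proxy) rather than at the sitemap file.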
 
It's so strange: now it's completely OK when I validate again, and I haven't changed anything yet. :oops:
 
It is XFDev - Proxy Check 1.4.2 that is causing the problem, so I have found the error.
The problem is solved, I hope. (y)
 