Thank you, mcrider.
I've generated an XML sitemap file (with http://www.xml-sitemaps.com) and placed it in the root of my website.
There aren't many details or options to choose from when creating the sitemap file:
- the journal URL
- change frequency: set to "None"
- last modification: set to "Use server response"
- priority: set to "None"
I left these options at their defaults and then placed the generated file in the journal folder.
Should I change these settings to fix the problem?
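For reference, this is roughly what a sitemap entry looks like when those optional fields are filled in (the URL and date here are just placeholders, not my actual site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- placeholder URL; replace with the journal's real address -->
    <loc>http://www.example.com/journal/</loc>
    <lastmod>2013-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

As I understand it, `changefreq` and `priority` are only hints to search engines, so leaving them at "None" shouldn't by itself cause indexing problems.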
Regarding robots.txt, it doesn't seem very effective for blocking robots and web crawlers, since "bad" robots will simply ignore the robots.txt file.
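Just to illustrate what I mean: a minimal robots.txt that asks all crawlers to skip a directory (the directory name is only an example) would be:

```text
User-agent: *
Disallow: /private/
```

Well-behaved crawlers like Googlebot honor these rules, but the file is purely advisory, so malicious bots can read it and crawl those paths anyway.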