Hi,
Just a quick question: why does the robots.txt generated by WB use the default "allow all" rule set, even though I have set each page to either allow or disallow?
For example, this is what I get from WB:
User-agent: *
Allow: /
Sitemap: http://www.mydomian.com.au/sitemap.xml
Instead of what I should be getting:
User-agent: *
Disallow:
Allow: /
Allow: /index
Disallow: /privacy
Disallow: /terms
Disallow: /error-contact
Disallow: /sucsess
Sitemap: http://www.mydomian.com.au/sitemap.xml
Thanks
Kevin
Re: robots.txt
The default rule can be set in the first 'rule' property of the robots.txt dialog.
https://wysiwygwebbuilder.com/robots_txt.html
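If it helps, here is a minimal sketch for double-checking which pages end up blocked once the per-page rules are in the generated file, using Python's built-in urllib.robotparser. The paths are copied from the example above; the blanket "Disallow:" / "Allow: /" lines are left out because urllib.robotparser applies rules in order (first match wins), so a leading catch-all rule would match every URL before the page-specific entries are reached.

# Sketch: feed the page-specific rules to Python's standard
# robots.txt parser and report which paths end up allowed or blocked.
from urllib.robotparser import RobotFileParser

# Rules taken from the expected robots.txt in the question above.
rules = """\
User-agent: *
Disallow: /privacy
Disallow: /terms
Disallow: /error-contact
Disallow: /sucsess
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

for path in ["/", "/index", "/privacy", "/terms", "/error-contact", "/sucsess"]:
    status = "allowed" if parser.can_fetch("*", path) else "blocked"
    print(path, "->", status)

Running this should print "allowed" for / and /index and "blocked" for the four disallowed pages, which is what the generated robots.txt is meant to express once the default rule and the per-page settings are applied.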