Posted - 04/07/2020 : 12:30:20
As it happens, this may have something to do with the issue. I had a look at the instructions, and a couple of conditions are missing that exempt robots.txt and XML files. The rule should read:

<rule name="Product Detail Page Match" stopProcessing="true">
  <match url="^([^/]+)/?$" />
  <conditions>
    <add input="{URL}" pattern="\.asp" negate="true" />
    <add input="{URL}" pattern="\.css" negate="true" />
    <add input="{URL}" pattern="\.gif" negate="true" />
    <add input="{URL}" pattern="\.jpg" negate="true" />
    <add input="{URL}" pattern="\.js" negate="true" />
    <add input="{URL}" pattern="\.png" negate="true" />
    <add input="{URL}" pattern="\.xml" negate="true" />
    <add input="{URL}" pattern="robots\.txt" negate="true" />
    <add input="{UNENCODED_URL}" pattern="[^/]+" />
  </conditions>
  <action type="Rewrite" url="proddetail.asp?prod={UrlEncode:{R:1}}" />
</rule>

Vince
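The point of the negated {URL} conditions is that the match pattern ^([^/]+)/?$ is a catch-all: without them, a request for robots.txt or an XML sitemap would also be rewritten to proddetail.asp. For context, here is a minimal sketch of where the rule sits in web.config, assuming the standard IIS URL Rewrite module layout (the surrounding file is illustrative, not taken from the instructions):

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- the Product Detail Page Match rule above goes here -->
      </rules>
    </rewrite>
  </system.webServer>
</configuration>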