Articles by Upendo Ventures

We "Upendo" to Write for You!

Our team loves to share our thoughts, research, and experiences here on our blog. Please enjoy our articles and leave a comment to let us know what you think.

DNN Summit Session: Multi-Site DNN Instances and Robots.txt


Our first session at DNN Summit 2021 was a short introduction to search engine optimization (SEO), followed by a deeper dive into managing the Robots.txt file in your DNN website. Managing this critical SEO file is easier than ever on DNN 9.8 and newer. This blog post is a follow-up to provide resources for the event attendees.

In short, you can now manage your robots.txt file(s) directly within DNN while logged in as a superuser. You should review this file regularly, especially before you launch a new website.

Slide Deck

Multi-Site Support in DNN

In order to support multiple sites that each have their own robots.txt file, leave the default robots.txt as it is for your primary website. Then, add a new file for each additional site you wish to support. Usually, this will start as a copy of the original robots.txt, but the file name will be prefixed with that site's domain. For example, if the additional site is localhost.demo1 (as in the code sample below), then its file will simply be named localhost.demo1.robots.txt instead of robots.txt.

Now, when you're logged in as a superuser, you can edit any of the robots.txt files that are found in the root of the website folder, using the existing Config Manager feature.
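As an illustration, here is what a pair of these files might look like side by side in the website root. The Disallow paths are hypothetical examples only; your actual rules will depend on what each site should expose to crawlers.

```
# robots.txt  (primary site)
User-agent: *
Disallow: /admin/

# localhost.demo1.robots.txt  (additional site)
User-agent: *
Disallow: /
```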

Code Sample

In order to get this fully working when requested by search engines and other bots, you'll first need to be sure the IIS URL Rewrite module is installed, then add a rewrite rule for each site, like you see in the example below. You'll want to put the rules inside of the <system.webServer> section of your web.config.

        <rewrite>
          <rules>
            <rule name="Robots.txt: Demo Site 1" stopProcessing="true">
              <match url=".+" />
              <conditions logicalGrouping="MatchAll">
                <add input="{HTTP_HOST}" pattern="localhost.demo1" />
                <add input="{REQUEST_FILENAME}" pattern="robots.txt" />
              </conditions>
              <action type="Rewrite" url="localhost.demo1.{C:0}" logRewrittenUrl="true" />
            </rule>
            <rule name="Robots.txt: Demo Site 2" stopProcessing="true">
              <match url=".+" />
              <conditions logicalGrouping="MatchAll">
                <add input="{HTTP_HOST}" pattern="localhost.demo2" />
                <add input="{REQUEST_FILENAME}" pattern="robots.txt" />
              </conditions>
              <action type="Rewrite" url="localhost.demo2.{C:0}" logRewrittenUrl="true" />
            </rule>
            <rule name="Robots.txt: Demo Site 3" stopProcessing="true">
              <match url=".+" />
              <conditions logicalGrouping="MatchAll">
                <add input="{HTTP_HOST}" pattern="localhost.demo3" />
                <add input="{REQUEST_FILENAME}" pattern="robots.txt" />
              </conditions>
              <action type="Rewrite" url="localhost.demo3.{C:0}" logRewrittenUrl="true" />
            </rule>
          </rules>
        </rewrite>

Keeping the example code above in mind, when you visit localhost.demo1 and request the robots.txt file, you'll now see the specific file that's maintained only for that website.
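The host-to-file mapping that the rewrite rules perform can be sketched as a small shell function. This is only a sketch of the rule logic for illustration: map_robots is a hypothetical helper (not part of IIS or DNN), and the host names are the demo values from the example above.

```shell
# map_robots HOST — print the file the rewrite rules would serve
# when HOST requests /robots.txt (sketch of the IIS rule logic above)
map_robots() {
  case "$1" in
    localhost.demo1|localhost.demo2|localhost.demo3)
      # a rule's conditions matched: rewrite to the per-site file
      echo "$1.robots.txt" ;;
    *)
      # no rule matched: the default robots.txt is served
      echo "robots.txt" ;;
  esac
}

map_robots localhost.demo2           # prints "localhost.demo2.robots.txt"
map_robots www.primary-site.example  # prints "robots.txt"
```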

Contact Us
Let's Chat!

We'd love to work with you. Let's talk about how.

Contact Us

About the Author

Will Strohl, Founder & CEO
Upendo Ventures
Overall, Will has nearly 20 years of experience helping website owners become more successful in all areas, including mentoring, website development, marketing, strategy, e-commerce, and more.


Stay Informed

  • Join our newsletter. Don't worry. We don't share, sell, or spam.

About Our Company

We use technology to help your business change people's lives! Our business is dedicated to implementing best practices, automations, and integrations to help your business grow and generate more leads online. Our battle-tested techniques help give you time back so you can worry about your business, and not the technology that runs it.

The Upendo team is proud to be a DNN partner, consultant, expert, and developer offering the best DNN support - as well as the people behind the best DNN shopping cart e-commerce solution, Hotcakes Commerce.