Wednesday, September 19, 2012

Robots.txt file in MVC3

Recently I was asked to look at adding a robots.txt file to a client's website that we had built on ASP.NET MVC3. So I did some research and found some interesting information about the robots.txt file:
  1. The file is just a suggestion; bots are not required to honor what you ask of them in the robots.txt file.
  2. The file is really an exclusion list rather than an inclusion list. That means you list the places you don't want bots to view, which can be a bad idea: it hands malicious bots a map of exactly the areas to focus on hacking (see the example below).
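For example, a robots.txt that asks crawlers to stay out of a hypothetical /admin section (the path here is purely illustrative) would look like this:

User-agent: *
Disallow: /admin/
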
After my research we decided not to put a robots.txt file on the website initially. Soon after we deployed, we noticed a considerable number of errors in the ELMAH logs, all containing this message:
The controller for path '/robots.txt' was not found or does not implement IController.

We decided we at least needed an empty robots.txt file out there to prevent all these unnecessary errors, so I did some more research and put together a solution for MVC3:
  • Add a physical robots.txt file to the project at the root level of the website. It can be empty or contain the basic level of content required in a robots.txt, as shown below.
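That basic level of content is only a couple of lines. This version, for instance, allows everything, which behaves the same as an empty file:

User-agent: *
Disallow:
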
Now that the physical file exists on the website, ASP.NET MVC3 routing will leave it alone, as long as you haven't changed the default setting of the RouteExistingFiles property on the RouteCollection: by default, routing is skipped whenever a physical file matches the URL.
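
If you want to be explicit about it, the property lives on the same RouteCollection you configure in Global.asax.cs, and false is already the default, so this line is optional:

routes.RouteExistingFiles = false; // physical files on disk bypass MVC routing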

To ensure the physical file is always served, even if someone later changes the RouteExistingFiles property, you can add the following ignore route to the global.asax.cs file:
routes.IgnoreRoute("{*robotstxt}", new { robotstxt = @"(.*/)?robots\.txt(/.*)?" });
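
Here is a sketch of how that fits into a typical MVC3 RegisterRoutes method (the "Default" route shown is just the standard project-template route; your own routes will differ):

public static void RegisterRoutes(RouteCollection routes)
{
    // Ignore routes must be registered before any MapRoute calls.
    // {*robotstxt} is a catch-all parameter, and the constraint limits
    // the match to robots.txt URLs so other requests still hit your routes.
    routes.IgnoreRoute("{*robotstxt}", new { robotstxt = @"(.*/)?robots\.txt(/.*)?" });

    routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

    routes.MapRoute(
        "Default",                    // route name
        "{controller}/{action}/{id}", // URL with parameters
        new { controller = "Home", action = "Index", id = UrlParameter.Optional }
    );
}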

Your mileage may vary with the robots.txt file; it's not a bad idea to have one with some exclusions if you genuinely need to keep parts of your content away from web crawlers or bots.

This particular client didn't really need one because most, if not all, of their website's content required logging in, so bots and web crawlers wouldn't get much from crawling the site.

Tuesday, September 4, 2012

Using BIDS 2008 to access TFS 2010

Recently I had a client that needed to use BIDS to create some SSRS reports, but they wanted to use TFS 2010 for their source control. The latest version of BIDS is really a VS2008 shell, so I knew it wouldn't be as simple as installing Team Explorer. I was sure I wasn't the first person to come across this issue, so I pulled out my favorite research tool and looked up the exact steps needed to enable Team Explorer for BIDS 2008.

I found several blog posts on the subject, but this blog post by Joost van Rossum not only gave me the exact steps to set up Team Explorer in VS2008 so it would connect to TFS 2010, it was also written from the perspective of using BIDS 2008 rather than plain VS2008. Here is an overview of the steps Joost laid out in his blog post:
  1. Install Team Explorer 2008 (download link)
  2. Install SP1 for Visual Studio 2008 (download link)
  3. Install VSTS 2008 SP1 Forward Compatibility Update for TFS2010 (download link)
  4. Once you have all those pieces installed, create your server reference in the form http://<serverName>:<port>/<vdir>/<collectionName> (e.g. http://TfsServer:8080/tfs/ProjectCollectionName), which is different from how you do it in VS2010.

So I followed the above steps and thought everything was working great until I tried to log in to TFS using BIDS 2008 and received this error message popup:

Unable to switch servers at this time.  The Team Explorer is busy.

So I went back to my favorite research tool and came across this blog post by Tom Hundley.

At the end of his post I discovered that my problem came down to a single character! As soon as I removed the trailing / from my server URL, everything worked like a charm!

Thanks for the help Joost and Tom!