Tuesday, November 13, 2012

Setting up DBMail in SQL Server 2012

Recently I needed to set up Database Mail in SQL Server 2012, and here is what I did (so if/when I have to do it again I will remember how to do it):

Run this script to create the sysmail account and profile:
USE msdb
GO
DECLARE @ProfileName VARCHAR(255)
DECLARE @AccountName VARCHAR(255)
DECLARE @SMTPAddress VARCHAR(255)
DECLARE @EmailAddress VARCHAR(128)
DECLARE @DisplayUser VARCHAR(128)

SET @ProfileName = 'DefaultDBMailProfile';
SET @AccountName = 'DefaultDBMailAccount';
SET @SMTPAddress = 'smtp server address goes here';
SET @EmailAddress = 'from email address goes here';
SET @DisplayUser = 'from display name goes here';

EXECUTE msdb.dbo.sysmail_add_account_sp
    @account_name = @AccountName,
    @email_address = @EmailAddress,
    @display_name = @DisplayUser,
    @mailserver_name = @SMTPAddress;

EXECUTE msdb.dbo.sysmail_add_profile_sp
    @profile_name = @ProfileName;

EXECUTE msdb.dbo.sysmail_add_profileaccount_sp
    @profile_name = @ProfileName,
    @account_name = @AccountName,
    @sequence_number = 1;

Run this script to enable Database Mail on the server (MSDN Link to Database Mail XPs Server Configuration Option):
USE Master
GO
sp_configure 'show advanced options', 1
GO
reconfigure with override
GO
sp_configure 'Database Mail XPs', 1
GO
reconfigure 
GO
sp_configure 'show advanced options', 0
GO
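
To confirm the option took effect, you can check the standard sys.configurations catalog view. Once Database Mail XPs is enabled, value_in_use should show 1:
```sql
-- value_in_use reflects the currently running configuration
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = 'Database Mail XPs';
```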

To test and make sure you have it set up correctly, you can use this script:
EXEC msdb.dbo.sp_send_dbmail
@recipients = 'to email address goes here',
@body= 'Test Email Body', 
@subject = 'Test Email Subject',
@profile_name = 'DefaultDBMailProfile'
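
If the test message never arrives, the Database Mail system procedures and views in msdb are the first place to look. These queries show the profile/account you just created, the status of queued messages, and any errors Database Mail has logged:
```sql
-- Review the profiles and accounts that exist on this server
EXECUTE msdb.dbo.sysmail_help_profile_sp;
EXECUTE msdb.dbo.sysmail_help_account_sp;

-- Check the send status of recent messages
SELECT sent_status, subject, recipients, send_request_date
FROM msdb.dbo.sysmail_allitems;

-- Check for errors logged by Database Mail
SELECT log_date, event_type, description
FROM msdb.dbo.sysmail_event_log
ORDER BY log_date DESC;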

Please note: many of the variables in the above scripts contain placeholder values. Be sure to set the variables to the values that apply to your environment.

Wednesday, September 19, 2012

Robots.txt file in MVC3

Recently I was asked to look at adding a robots.txt file to a client's website that we built on ASP.NET MVC3. So I did some research and found out some interesting information about the robots.txt file.
  1. The file is just a suggestion; bots are not required to follow what you ask them to do via the robots.txt file.  
  2. The file is really an exclusion list instead of an inclusion list.  This means you have to list the places you don't want the bots to view, which can be a bad idea in that it tells the bad bots exactly which areas to focus on hacking.
After my research we decided to not put a robots.txt file on the website initially. Soon after our deploy we noticed in the ELMAH logs that we were seeing a considerable amount of errors which contained this error message:
The controller for path '/robots.txt' was not found or does not implement IController.

So now we decided we at least needed an empty robots.txt file out there to prevent all these unnecessary errors. So I did some more research and developed a solution for MVC3:
  • Basically you just add the physical robots.txt file to the website by adding it to the project at the root level. It could be empty or could contain the basic level of content required in a robots.txt.
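For reference, the "basic level of content" amounts to just two lines. A minimal robots.txt that allows crawlers everywhere looks like this (an empty Disallow means nothing is blocked):
```
User-agent: *
Disallow:
```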
Now that you have the physical file on the website, ASP.NET MVC3 routing will leave it alone, as long as you haven't changed the default value of the RouteExistingFiles property on the RouteCollection. By default, RouteExistingFiles is false, which tells routing to skip any URL that matches a physical file on disk.

To ensure that the physical file will always be served up even if someone changes the RouteExistingFiles property you can add the following ignore route code to the global.asax.cs file:
routes.IgnoreRoute("{robotstxt}", new { robotstxt = @"(.*/)?robots\.txt(/.*)?" });
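
For context, here is where that call sits in a typical MVC3 RegisterRoutes method; the other routes below are the ones the standard project template generates, so adjust them to match your own global.asax.cs:
```csharp
public static void RegisterRoutes(RouteCollection routes)
{
    routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

    // Serve the physical robots.txt even if RouteExistingFiles has been set to true
    routes.IgnoreRoute("{robotstxt}", new { robotstxt = @"(.*/)?robots\.txt(/.*)?" });

    routes.MapRoute(
        "Default",                                              // Route name
        "{controller}/{action}/{id}",                           // URL with parameters
        new { controller = "Home", action = "Index", id = UrlParameter.Optional }
    );
}
```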

Your mileage may vary with the robots.txt file and it might not be a bad idea to have a robots.txt with some exclusions if you really need to exclude some of your content from web crawlers or bots.

This particular client didn't really need one because most if not all the content of their website required that you log into their website so bots and web crawlers wouldn't get much content from crawling their entire site.

Tuesday, September 4, 2012

Using BIDS 2008 to access TFS 2010

Recently I had a client that needed to use BIDS to create some SSRS reports but they wanted to use TFS2010 for their source control.  The latest version of BIDS is really a VS2008 shell so I knew it wouldn't be as simple as installing Team Explorer.  I was sure I wasn't the first person to come across this issue so I pulled out my favorite research tool and did some research on the exact steps that need to be taken to enable Team Explorer for BIDS 2008.

I found several blog posts about the subject, but this blog post by Joost van Rossum not only gave me the exact steps to set up Team Explorer in VS2008 so it would connect to TFS2010, but was also written from the perspective of using BIDS2008 and not just VS2008.  Here is an overview of the steps that Joost laid out in his blog post:
  1. Install Team Explorer 2008 (download link)
  2. Install SP1 for Visual Studio 2008 (download link)
  3. Install VSTS 2008 SP1 Forward Compatibility Update for TFS2010 (download link)
  4. Once you have all those pieces installed, you have to create your server reference in the form http://<serverName>:<port>/<vdir>/<collectionName> (e.g. http://TfsServer:8080/tfs/ProjectCollectionName), which is different from how you do it in VS2010.

So I followed the above steps and thought everything was working great until I tried to log in to TFS using BIDS 2008 and received this error message popup:

Unable to switch servers at this time.  The Team Explorer is busy.


So I went back to my favorite research tool to do some more research and came across this blog post by Tom Hundley.

At the end of his post I discovered that my problem was with a single character!  As soon as I removed the trailing / from my server URL, everything worked like a charm!

Thanks for the help Joost and Tom!

Saturday, June 30, 2012

_ViewStart.cshtml Info


In ASP.NET MVC3, the role of “master pages” is filled by layouts, and the _ViewStart.cshtml file is where the default layout is typically set.  As the name suggests, the code in this file is executed before each view is rendered (see Scott Gu’s blog post (http://weblogs.asp.net/scottgu/archive/2010/10/22/asp-net-mvc-3-layouts.aspx) for more details).

Now that you understand the basic use of the _ViewStart.cshtml file, let’s go over the scope applied to these files.  A _ViewStart.cshtml file affects all views in its own directory and in the directories below it.  You can also place another _ViewStart.cshtml file in a sub-folder; it will be executed after the top-level _ViewStart.cshtml.  Using this feature you can, in effect, override the top-level _ViewStart.cshtml with one closer to the view.
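
In its most common form, _ViewStart.cshtml just sets the default layout for the views in its scope (the layout path below is the conventional project-template one, so adjust it to your project):
```cshtml
@{
    Layout = "~/Views/Shared/_Layout.cshtml";
}
```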

(Image: ViewStartExample — the Views folder with a top-level _ViewStart.cshtml, a Home folder containing Index.cshtml, and a DifferentMasterPage folder containing its own _ViewStart.cshtml and Index.cshtml.)
Now when the Index.cshtml View under the Home folder is rendered, it will first execute the /Views/_ViewStart.cshtml file and then it will render the Index.cshtml View.

However, when the Index.cshtml View under the DifferentMasterPage folder is rendered, it will first execute the /Views/_ViewStart.cshtml file, then it will execute the /Views/DifferentMasterPage/_ViewStart.cshtml file, and then it will render the Index.cshtml View.