Category Archives: Uncategorized

Archiving CMS type data using SubSonic

Rob Conery asked that we share some of the things we’ve done with the new SubSonic, in celebration of its 2.1 release.  I’ve been using various svn versions of SubSonic for some time now and feel that I have a decent grasp of its new features.

One thing that seems to keep coming up is the need to implement an archive table for CMS type data.  So I whipped up some generic utility functions that make use of the new query engine API to handle the archive process automatically.  If the latest archive row does not match the object specified, these helpers will automatically rip through all the columns of the object and set the values on a new archive record.  To use this properly, you’ll want the archive table to have the same column names as your original table.

There are a few assumptions with these utility helpers:

  1. The archive table needs a CreatedOn column of type datetime – this could probably be changed to sort by the primary key column instead, but I don’t think it’s a major issue
  2. Your model needs to be based on RepositoryRecord – it could easily be modified to support ActiveRecord (see the sketch after the helper code below)

Hope this helps, and if you use this, please include a note about where you found it…


To make use of the magic, you would do something like:

DB.Save(somePost, User.Identity.Name); // Save the post as normal
ArchiveHelper.CheckAndCreateArchive<Post, PostArchive>(somePost, User.Identity.Name); // Create an archive entry if anything changed


And now the helper methods (I put them in a sealed class called ArchiveHelper)…

/// <summary>
/// Checks to see if changes have been made and if so, creates an archive.
/// </summary>
/// <typeparam name="T">Type of the item to archive</typeparam>
/// <typeparam name="TArchiveType">The type of the archive type.</typeparam>
/// <param name="item">The latest item</param>
/// <param name="modifiedBy">Username doing the modifications</param>
public static void CheckAndCreateArchive<T, TArchiveType>(T item, string modifiedBy) where T : IRecordBase where TArchiveType : RepositoryRecord<TArchiveType>, IRecordBase, new()
{
    if (HasChanged<T, TArchiveType>(item))
    {
        TArchiveType archive = CreateArchive<T, TArchiveType>(item);
        DB.Save(archive, modifiedBy);
    }
}

/// <summary>
/// Determines whether the specified item has changed since its last archive.
/// </summary>
/// <typeparam name="T">Type of the item to archive</typeparam>
/// <typeparam name="TArchiveType">The type of the archive type.</typeparam>
/// <param name="item">The latest item</param>
/// <returns>
///     <c>true</c> if the specified item has changed; otherwise, <c>false</c>.
/// </returns>
public static bool HasChanged<T, TArchiveType>(T item) where T : IRecordBase where TArchiveType : RecordBase<TArchiveType>, IRecordBase, new()
{
    TableSchema.Table itemSchema = item.GetSchema();

    SqlQuery latestArchiveQuery = DB.Select().Top("(1)").From<TArchiveType>().Where(itemSchema.PrimaryKey.ColumnName).IsEqualTo(item.GetPrimaryKeyValue()).OrderDesc("CreatedOn");
    List<TArchiveType> archives = latestArchiveQuery.ExecuteTypedList<TArchiveType>();

    if (archives.Count == 0)
        return true;

    TArchiveType latestArchive = archives[0];
    TableSchema.Table archiveSchema = latestArchive.GetSchema();

    foreach (SubSonic.TableSchema.TableColumn column in itemSchema.Columns)
    {
        if (IsReservedColumnName(column.ColumnName))
            continue;

        bool containsColumn = false;
        foreach (SubSonic.TableSchema.TableColumn c in archiveSchema.Columns)
        {
            if (c.ColumnName.Equals(column.ColumnName, StringComparison.OrdinalIgnoreCase))
            {
                containsColumn = true;
                break;
            }
        }

        if (containsColumn)
        {
            // use the static object.Equals so a null column value doesn't throw
            object currentValue = item.GetColumnValue(column.ColumnName);
            object archivedValue = latestArchive.GetColumnValue(column.ColumnName);

            if (!Equals(currentValue, archivedValue))
                return true;
        }
    }
    return false;
}

/// <summary>
/// Creates an archive of the specified item.
/// </summary>
/// <typeparam name="T">Type of the item to archive</typeparam>
/// <typeparam name="TArchiveType">The type of the archive type.</typeparam>
/// <param name="item">The item used to create the archive</param>
/// <returns>A new archive instance populated from the specified item</returns>
public static TArchiveType CreateArchive<T, TArchiveType>(T item) where T : IRecordBase where TArchiveType : IRecordBase, new()
{
    TableSchema.Table itemSchema = item.GetSchema();

    TArchiveType archive = new TArchiveType();
    TableSchema.Table archiveSchema = archive.GetSchema();

    foreach (SubSonic.TableSchema.TableColumn column in itemSchema.Columns)
    {
        if (IsReservedColumnName(column.ColumnName))
            continue;

        bool containsColumn = false;
        foreach (SubSonic.TableSchema.TableColumn c in archiveSchema.Columns)
        {
            if (c.ColumnName.Equals(column.ColumnName, StringComparison.OrdinalIgnoreCase))
            {
                containsColumn = true;
                break;
            }
        }

        if (containsColumn)
        {
            archive.SetColumnValue(column.ColumnName, item.GetColumnValue(column.ColumnName));
        }
    }

    return archive;
}

private static bool IsReservedColumnName(string column)
{
    return (Utility.IsAuditField(column) || Utility.IsLogicalDeleteColumn(column));
}
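
If your models are ActiveRecord-based instead (assumption 2 above), the change should be small.  Here’s an untested sketch of what the ActiveRecord flavor of CheckAndCreateArchive might look like, assuming your generated classes still satisfy the constraints used by HasChanged and CreateArchive:

/// <summary>
/// ActiveRecord-flavored CheckAndCreateArchive (untested sketch) - HasChanged
/// and CreateArchive stay the same apart from loosening their generic
/// constraints in the same way.
/// </summary>
public static void CheckAndCreateArchive<T, TArchiveType>(T item, string modifiedBy) where T : IRecordBase where TArchiveType : ActiveRecord<TArchiveType>, IRecordBase, new()
{
    if (HasChanged<T, TArchiveType>(item))
    {
        TArchiveType archive = CreateArchive<T, TArchiveType>(item);
        archive.Save(modifiedBy); // ActiveRecord instances persist themselves
    }
}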

Iterate ASP.NET form validators client side

I added a client side onclick function to a submit button, but wanted to verify that the form validators were all valid before performing my magic.  Unfortunately, you can’t change when the validator logic gets fired, so my onclick function always gets processed before the form validation logic.  Luckily, Microsoft provides a few client side helper methods/properties that allow you to work with the form validators.
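
For reference, the wiring itself is nothing special (the control and function names here are made up for illustration):

// hypothetical code-behind wiring: the client-side function runs the validator
// loop shown below itself and returns false to cancel the postback if anything is invalid
protected void Page_Load(object sender, EventArgs e)
{
    btnSubmit.OnClientClick = "return submitClicked();";
}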

You can enable/disable validators via the ValidatorEnable(validator, enable) function, which seems like it would be useful for complicated forms…  But for me, I wanted to iterate the validators, have them perform their logic, and then determine if any of them were invalid.  The following code block will do just that:

var isValid = true;
if (Page_Validators) {
    for (var i = 0; i < Page_Validators.length; i++) {
        ValidatorValidate(Page_Validators[i]);
        isValid = Page_Validators[i].isvalid;
        if (!isValid)
            break;
    }
}


To iterate the validators and have them display their message in a validation summary, you’d want to do something like:

var isValid = true;
if (Page_Validators) {
    for (var i = 0; i < Page_Validators.length; i++) {
        ValidatorValidate(Page_Validators[i]);
        if (isValid)
            isValid = Page_Validators[i].isvalid;
    }
    Page_IsValid = isValid;
    ValidationSummaryOnSubmit();
}


Hope this helps someone…

Canada 2008

Around the first week of June, I was able to relax in Ontario, Canada with my brother, dad, dad’s friend Larry, and Molly.  It was nice to get away for a few days and not be concerned about email, TV, and cell phones.  One of the cool things about going up there this time of year is that it stays bright from 5am-11pm.  That makes the guilt of taking a nap pretty much non-existent. 🙂  If you’re interested, I posted a few of the pictures here.


Fix for GroupWise MAPI Session Handle issue

I recently paved a new machine and had to put GroupWise on it.  To make a long story short, the machine got into a state where I would receive a message like "The MAPI Session Handle of the Novell GroupWise ‘Object Request broker’ could not be accessed. The address book cannot run" every time I tried to compose an email.

A lot of searching turned up very little help.  Novell had a craptastic answer regarding this, suggesting that you uninstall everything including Outlook and then reinstall.  But this issue is related to the MAPI client not being installed properly.  You don’t have to reinstall GroupWise like other sites suggest; just install the MAPI client from Microsoft.  After I installed the MAPI client, GroupWise works as it should…

The benefits of using Git for source control

I was listening to the Hanselminutes episode about Git tonight.  Overall, it’s a very cool overview of Git and how it relates to development and existing source control systems.  A situation regarding local development came up during the talk that I’ve personally run into on pretty much every project…

With traditional source control (Subversion, CVS, Vault, SourceSafe, etc.), you check in your modifications when they’re stable enough for other people on your team to consume.  Git adds the notion of a local repository.  So rather than having one central repository that everyone shares, each user has their own personal repository.  You can do all your checkins/checkouts/modifications/branches against your local repository without affecting other developers on your team.  Then, similarly to traditional source control, you can push your changes to your peers (or to a primary peer – think of this as a stable repository on the build server) when things are stable.

To me, this is huge.  I can’t tell you how many times I’ve rewritten something, only to go back to the original approach later… or worse, lost changes.  With Git, you can check in your changes anytime you feel like it and not worry about screwing up your coworkers’ experience.  You get the benefits of source control without the headaches of breaking the build for everyone else.

The guys that Hanselman interviewed from Planet Argon have their primary peer hooked up to Subversion.  So they can use Git locally, push their changes to the primary peer, which in turn commits those changes to Subversion. 

Enhanced the Graffiti Extras project

I came across the Graffiti Extras project today and decided to modify the sharing portion of it.  I added DotNetKicks to the list of social apps and made some other minor updates as well.

By far, the trickiest part for me was figuring out how to actually use the extension.  Other than that, everything went smoothly, and you can see that I’m currently using it on this site.

To use the extension, just add the following to your post.view (or whatever view you want).  Note: ~/__utility/img/sharing represents the directory where I placed the sharing images.

$sharing.Write($post, "~/__utility/img/sharing")

To show only DotNetKicks and, say, Digg, you can do so with:

$sharing.Write($post, "~/__utility/img/sharing", "&nbsp;", "DotNetKicks,Digg It!")


Hope this helps someone

VisualSVN Server

Wow… well done!  I just installed VisualSVN Server and man… the team behind it did an excellent job.  If you’re looking into setting up a Subversion server in a Windows environment, but have held off because it involves Apache and a service traditionally installed in a *nix environment, look at VisualSVN Server. 

VisualSVN Server seamlessly installs and configures Apache and Subversion for you.  They also provide a handy GUI to manage security and repository information.  The installer makes running on a port other than 80 as easy as typing in the desired port number, so no worries if you want to run IIS on port 80 and have Apache on another port.

Upgraded to Graffiti

I’ve been working on moving a few of my websites to the Graffiti platform for a little while.  Migrating the old Bia Creations site from WordPress to Graffiti was straightforward and worked flawlessly.  By far, the most involved migration was this site, since there were a couple hundred posts that needed to be imported from the Community Server database into the new Graffiti database.  One unfortunate side effect is that some of the later comments were not imported.

I ended up writing my own migrator because I had performance and connection issues with the one that ships with Graffiti.  I pruned out posts that I felt didn’t add much quality to the site and created a rather large redirect script to help ease the transition to the new urls.  If someone else is moving from Community Server to Graffiti and is interested in how I generated the isapi_rewrite rules for the redirects, I can share it.  It’s not pretty, but it worked great for me.

My experience so far has been extremely pleasant… I’m using commercial versions of Graffiti, so being able to use the integrated reporting is exciting to me.  Yeah, I’ve always had Google Analytics, but it’s nice to have the reporting built right into the admin section.  Overall, I’m happy with the themes for both this site and the Bia Creations site.  Graffiti makes it really easy to create a basic theme, which is more than I can say for Community Server 🙂

Other than that, as a complement to the website upgrade, the web server got a nice upgrade as well.  The hardware was upgraded, adding much needed redundancy, and should provide a more stable environment for this website.  The OS was upgraded to Windows Server 2008 and the database to SQL Server 2005.  MS did an excellent job with the latest server OS, and I’m considering using it as a primary development environment instead of Vista.

Please let me know what you think and if you run into any issues.

Licensing component/library for dotnet

I’ve been quite busy with Property Center updates, another project that isn’t quite mature enough to be introduced, and consulting.  The unfortunate side effect of that is that my blog writing suffers. 

I have a question for you, though.  I am looking for a good .NET licensing component.  If possible, I’d like to encrypt some information like name, software plan, etc. with the license key.  I am fine with emailing a GUID to the person who bought the license, having them input that GUID into the program, and having the program grab the actual license file via the internet.  If it matters, the program will be an ASP.NET application.
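
To give a rough idea of the shape of what I’m after (just a sketch of the concept, not any particular product’s API – and it signs the data rather than encrypting it, which may be enough for my purposes): the downloaded license file could simply be the customer data plus an RSA signature, verified in the app against an embedded public key.

using System;
using System.Security.Cryptography;
using System.Text;

public static class LicenseSketch
{
    // vendor side: sign "name|plan|expiration" style data with the private key
    public static string Sign(string licenseData, RSAParameters privateKey)
    {
        using (RSACryptoServiceProvider rsa = new RSACryptoServiceProvider())
        {
            rsa.ImportParameters(privateKey);
            byte[] signature = rsa.SignData(Encoding.UTF8.GetBytes(licenseData), new SHA1Managed());
            return Convert.ToBase64String(signature);
        }
    }

    // application side: verify the downloaded license with the embedded public key
    public static bool Verify(string licenseData, string signature, RSAParameters publicKey)
    {
        using (RSACryptoServiceProvider rsa = new RSACryptoServiceProvider())
        {
            rsa.ImportParameters(publicKey);
            return rsa.VerifyData(Encoding.UTF8.GetBytes(licenseData), new SHA1Managed(), Convert.FromBase64String(signature));
        }
    }
}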

If you have any suggestions for such a library, I’m all ears.  Please email me or add a comment to this post. 

How to enable pretty URLs with ASP.NET MVC and IIS6

I’ve been working with the ASP.NET MVC bits lately and everything has been going smoothly.  That is, until I tried to deploy to a Windows 2003 server.  All of a sudden my pretty URLs looked like crap.  When it comes to deploying an ASP.NET MVC app to IIS6, you have two options:

  1. You can set up a wildcard mapping
  2. You can add an ISAPI mapping and have an extension (.mvc) handled by the ASP.NET runtime

Both techniques have their good and bad points.  The wildcard mapping gives you nice-looking URLs, but all requests (including images, CSS, scripts, etc.) go through the ASP.NET pipeline.  If performance is a concern, and generally it should be, you’ll want to avoid needlessly mapping all requests through the ASP.NET pipeline.  Alternatively, mapping just an extension to the ASP.NET runtime gives you the best performance, but the URLs are less desirable (e.g. http://localhost/Home.mvc/About).

My solution is to use ISAPI_Rewrite to transparently rewrite all the nice URLs to *.mvc requests.  So the user sees http://localhost/Home/About, while the webserver (after the ISAPI_Rewrite filter has been applied) sees http://localhost/Home.mvc/About.  ISAPI_Rewrite is fast (written in C/C++), so you don’t have to worry about the performance, and your static files (scripts, images, etc.) do not go through the pipeline.  Best of all, your users (and Google) see pretty URLs.  It’s a win-win setup for IIS6.

On to the good part…

  1. Set up your .mvc extension map in IIS
    1. Open IIS Management Console
    2. Expand your computer
    3. Expand Websites
    4. Right-click the website you’d like to edit (Most times it’ll be called "Default Web Site") and click Properties
    5. Click the Home Directory tab
    6. Click Configuration…
    7. Click Add
    8. Executable: c:\windows\microsoft.net\framework\v2.0.50727\aspnet_isapi.dll
    9. Extension: .mvc
    10. Verbs: Limit to: GET,HEAD,POST,DEBUG
    11. Un-check Verify that file exists
    12. Click Ok
  2. Install ISAPI_Rewrite
  3. Put an .htaccess file in your website directory and edit as follows (make note of how I ignore the Content directory):
    RewriteEngine on
    RewriteRule ^Home(/)?$ $9 [NC,R=301]
    RewriteRule ^$ Home [NC]
    RewriteRule ^([\w]+)$ $1.mvc [NC]
    RewriteRule ^(?!Content)([\w]*)/(.*) $1.mvc/$2 [NC]
    
  4. Modify your routes to include the routes with and without the .mvc extension.  My routes look like:
    routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

    routes.MapRoute(
        "Default",                                              // Route name
        "{controller}/{action}/{id}",                           // URL with parameters
        new { controller = "Home", action = "Index", id = "" }, // Parameter defaults
        new { controller = @"[^.]*" }                           // Parameter constraints - do not allow dots in the controller name
    );

    routes.MapRoute(
        "Default.mvc",                                          // Route name
        "{controller}.mvc/{action}/{id}",                       // URL with parameters
        new { controller = "Home", action = "Index", id = "" }, // Parameter defaults
        new { controller = @"[^.]*" }                           // Parameter constraints - do not allow dots in the controller name
    );
    

Using both routes, with and without the .mvc extension, gives you an added bonus.  The built-in helpers for generating URLs use the first route they find, so they will not generate URLs with the .mvc extension.  The MVC framework still recognizes and processes both forms of URLs, though.  So defining the routes this way preserves debugging with Cassini as well as the IIS6 deployment scenario.
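
For example (an illustrative snippet rather than code from the project, and the exact helper overloads vary a bit between previews), a link built with the stock helpers comes out extensionless:

// in a view (or anywhere with an HtmlHelper), the helper uses the first
// matching route - the extensionless "Default" one - so this produces
// <a href="/Home/About">About</a> rather than a link to /Home.mvc/About
string link = Html.ActionLink("About", "About", "Home");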

One other thing to note: the "RewriteRule ^Home(/)?$ $9 [NC,R=301]" line in the .htaccess file is there to redirect requests for /Home and /Home/ back to the base URL.  "RewriteRule ^$ Home [NC]" then rewrites requests for the base URL to the Home controller (transparently to the user).  The final two lines transparently map extensionless controller URLs to their .mvc equivalents, allowing the request to be processed by ASP.NET (via the IIS .mvc extension mapping set up in step 1).

Everything seems to be working great.  Hope this helps anyone in the same position.

** Updated for Preview 4