All posts by Jim Geurts

New website: DMBStream.com

Those who know me probably know that I enjoy listening to Dave Matthews.  Over the weekend, I built a site that allows me to listen to some live shows wherever I may be.  It’s nice to not have to lug around a USB drive full of music.  The site is http://dmbstream.com, and its main purpose is to let you stream bootlegs of DMB shows.  It’s like Pandora, but allows you to rewind, fast forward, and listen to entire concerts as many times as you like.

It was a blast to create and allowed me to "play" with new technologies that might be harder to try out in other projects.  I must say that I’m still pretty amazed at how quickly it all came together.  I had the core framework built over the weekend (including CSS and Photoshop hacking).  Search, comments, and photo integration were added shortly afterward.  There’s still a lot that I want to do with the site, but I feel it’s in a good enough place for people to go and play with it.

I’d like to hear what you think, so if you have any questions or comments please let me know.

Disqus comments

Though it’s a trend with some bloggers right now, I decided to pass on setting up Disqus comments on this blog, for now.  I really like the idea and the implementation, but a few concerns swayed me away from installing it:

  1. The comments are not stored with my blog
  2. If disqus.com goes down, the comments go with it
  3. SEO – comments add to a site’s SEO, but if they’re injected via JavaScript, that benefit is lost

Maybe if I get complaints I’ll change my mind, but for now I’m going to hold off…

Create and delete cookies in an iframe with IE

Internet Explorer shines yet again… I had to write a login sync for two sites (single sign-on), so that logging into one site also (behind the scenes) logs you into the other.  I chose to do this with iframes and all was happy… until I had to verify that it worked in IE.  IE chooses to dismiss cookies set or removed through an iframe.  Setting a header on the page that creates/deletes the cookies will remedy this, though.

Add this to your pages that create/delete the cookies:

HttpContext.Current.Response.AddHeader("p3p", "CP=\"IDC DSP COR ADM DEVi TAIi PSA PSD IVAi IVDi CONi HIS OUR IND CNT\"");

It creates a p3p "compact policy" that gets sent to the client. This convinces IE to accept the cookie in some magical way…

Via articles on Asp.Net Resources and SalesForce

BinaryResult for Asp.Net MVC

I was working with the latest drop of the ASP.NET MVC framework and had a need to stream a PDF back to the user… A quick glance around the System.Web.Mvc namespace only yielded ContentResult.  While that’s cool for string content, I needed something more flexible for binary data.  So I wrote the BinaryResult:

public class BinaryResult : ActionResult
{
    public byte[] Data { get; set; }
    public bool IsAttachment { get; set; }
    public string FileName { get; set; }
    public string ContentType { get; set; }

    public override void ExecuteResult(ControllerContext context)
    {
        context.HttpContext.Response.Clear();
        context.HttpContext.Response.ContentType = ContentType;
        if (!string.IsNullOrEmpty(FileName))
        {
            context.HttpContext.Response.AddHeader("content-disposition",
                ((IsAttachment) ? "attachment;filename=" : "inline;filename=") +
                FileName);
        }
        context.HttpContext.Response.BinaryWrite(Data);
    }
}

You can use this in your controllers to stream any type of binary data as the ActionResult for an action.  For my example, I call it like:

public ActionResult Download()
{
    Doc doc = new Doc();
    doc.Read(Server.MapPath("~/Sample.pdf"));
    using (MemoryStream stream = new MemoryStream())
    {
        doc.Save(stream);

        return new BinaryResult
                   {
                       ContentType = "application/pdf",
                       FileName = "Sample.pdf",
                       IsAttachment = true,
                       Data = stream.ToArray()
                   };
    }
}

By the way, sorry if I’m reinventing the wheel with this, but Google gave no love when I tried searching for it.

Complex SQL conditional statements with SubSonic 2.1

Have you needed to do a complex where clause with a query, but didn’t want to write raw SQL?  That’s where SubSonic expression constraints come in handy.  Say you want to do something like:

select * from Product where IsActive = 1 and (ExpiredOn is null OR ExpiredOn <= '01/01/2020')

Using SubSonic 2.1, you can do the following:

DateTime futureDate = new DateTime(2020, 1, 1);

List<Product> products = DB.Select().From<Product>()
    .Where(Product.Columns.IsActive).IsEqualTo(true)
    .AndExpression(Product.Columns.ExpiredOn).IsNull()
        .Or(Product.Columns.ExpiredOn).IsLessThanOrEqualTo(futureDate)
    .CloseExpression()
    .ExecuteTypedList<Product>();

Basically, the AndExpression part translates to a SQL "and" operator followed by an opening parenthesis, and CloseExpression translates to the closing parenthesis.  Theoretically, you could nest these bad boys as deep as you’d like.  Note: SubSonic also comes with an OrExpression.

Hope this helps.

Archiving CMS type data using SubSonic

Rob Conery asked that we share some of the things that we’ve done with the new SubSonic, in celebration of its 2.1 release.  I’ve been using various svn versions of SubSonic for some time now and feel that I have a decent grasp of its new features.

One thing that seems to keep coming up is the need to implement an archive table for CMS-type data.  So I whipped up some generic utility functions that make use of the new query engine API to handle the archive process automatically.  If the latest archive record does not match the object specified, these helpers will automatically rip through all the columns of the object and set the values on a new archive record.  To use this properly, you’ll want the same column names in the archive table that you have in your original table.

There are a few assumptions with these utility helpers:

  1. The archive table needs a CreatedOn column of type datetime – this could probably be changed to sort by the primary key column instead, but I don’t think it’s a major issue
  2. Your model needs to be based on RepositoryRecord – this could easily be modified to support ActiveRecord

Hope this helps, and if you use this, please include a note for where you found it…

To make use of the magic, you would do something like:

DB.Save(somePost, User.Identity.Name); // Save the post as normal
ArchiveHelper.CheckAndCreateArchive<Post, PostArchive>(somePost, User.Identity.Name); // Archive it if anything changed

And now the helper methods (I put them in a sealed class called ArchiveHelper)…

/// <summary>
/// Checks to see if changes have been made and if so, creates an archive.
/// </summary>
/// <typeparam name="T">Type of the item to archive</typeparam>
/// <typeparam name="TArchiveType">The type of the archive type.</typeparam>
/// <param name="item">The latest item</param>
/// <param name="modifiedBy">Username doing the modifications</param>
public static void CheckAndCreateArchive<T, TArchiveType>(T item, string modifiedBy) where T : IRecordBase where TArchiveType : RepositoryRecord<TArchiveType>, IRecordBase, new()
{
    if (HasChanged<T, TArchiveType>(item))
    {
        TArchiveType archive = CreateArchive<T, TArchiveType>(item);
        DB.Save(archive, modifiedBy);
    }
}

/// <summary>
/// Determines whether the specified item has changed since its last archive.
/// </summary>
/// <typeparam name="T">Type of the item to archive</typeparam>
/// <typeparam name="TArchiveType">The type of the archive type.</typeparam>
/// <param name="item">The latest item</param>
/// <returns>
///     <c>true</c> if the specified item has changed; otherwise, <c>false</c>.
/// </returns>
public static bool HasChanged<T, TArchiveType>(T item) where T : IRecordBase where TArchiveType : RecordBase<TArchiveType>, IRecordBase, new()
{
    TableSchema.Table itemSchema = item.GetSchema();

    SqlQuery latestArchiveQuery = DB.Select().Top("(1)").From<TArchiveType>()
        .Where(itemSchema.PrimaryKey.ColumnName).IsEqualTo(item.GetPrimaryKeyValue())
        .OrderDesc("CreatedOn");
    List<TArchiveType> archives = latestArchiveQuery.ExecuteTypedList<TArchiveType>();

    if (archives.Count == 0)
        return true;

    TArchiveType latestArchive = archives[0];
    TableSchema.Table archiveSchema = latestArchive.GetSchema();

    foreach (SubSonic.TableSchema.TableColumn column in itemSchema.Columns)
    {
        if (IsReservedColumnName(column.ColumnName))
            continue;

        bool containsColumn = false;
        foreach (SubSonic.TableSchema.TableColumn c in archiveSchema.Columns)
        {
            if (c.ColumnName.Equals(column.ColumnName, StringComparison.OrdinalIgnoreCase))
            {
                containsColumn = true;
                break;
            }
        }

        if (containsColumn)
        {
            // Use the static Equals overload so a null column value doesn’t throw
            if (!Equals(item.GetColumnValue(column.ColumnName), latestArchive.GetColumnValue(column.ColumnName)))
                return true;
        }
    }
    return false;
}

/// <summary>
/// Creates an archive of the specified item.
/// </summary>
/// <typeparam name="T">Type of the item to archive</typeparam>
/// <typeparam name="TArchiveType">The type of the archive type.</typeparam>
/// <param name="item">The item used to create the archive</param>
/// <returns></returns>
public static TArchiveType CreateArchive<T, TArchiveType>(T item) where T : IRecordBase where TArchiveType : IRecordBase, new()
{
    TableSchema.Table itemSchema = item.GetSchema();

    TArchiveType archive = new TArchiveType();
    TableSchema.Table archiveSchema = archive.GetSchema();

    foreach (SubSonic.TableSchema.TableColumn column in itemSchema.Columns)
    {
        if (IsReservedColumnName(column.ColumnName))
            continue;

        bool containsColumn = false;
        foreach (SubSonic.TableSchema.TableColumn c in archiveSchema.Columns)
        {
            if (c.ColumnName.Equals(column.ColumnName, StringComparison.OrdinalIgnoreCase))
            {
                containsColumn = true;
                break;
            }
        }

        if (containsColumn)
        {
            archive.SetColumnValue(column.ColumnName, item.GetColumnValue(column.ColumnName));
        }
    }

    return archive;
}

private static bool IsReservedColumnName(string column)
{
    return (Utility.IsAuditField(column) || Utility.IsLogicalDeleteColumn(column));
}

Iterate asp.net form validators client side

I added a client-side onclick function to a submit button, but wanted to verify that the form validators were all valid before performing my magic.  Unfortunately, you can’t change when the validator logic gets fired, so my onclick function always gets processed before the form validation logic.  Luckily, Microsoft provides a few client-side helper methods/properties that allow you to work with the form validators.

You can enable/disable validators via the ValidatorEnable(validator, enable) function, which seems like it would be useful for complicated forms…  For me, though, I wanted to iterate the validators, have them perform their logic, and then determine whether any of them were invalid.  The following code block will do just that:

var isValid = true;
if (typeof(Page_Validators) != "undefined") {
    for (var i = 0; i < Page_Validators.length; i++) {
        ValidatorValidate(Page_Validators[i]);
        isValid = Page_Validators[i].isvalid;
        if (!isValid)
            break;
    }
}

To iterate the validators and have them display their messages in a validation summary, you’d want to do something like:

var isValid = true;
if (typeof(Page_Validators) != "undefined") {
    for (var i = 0; i < Page_Validators.length; i++) {
        ValidatorValidate(Page_Validators[i]);
        if (isValid)
            isValid = Page_Validators[i].isvalid;
    }
    Page_IsValid = isValid;
    ValidationSummaryOnSubmit();
}
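If you want to reuse that loop elsewhere, it can be wrapped in a small function.  This is just a sketch: `allValidatorsValid` is a name I made up, and the stub validators below only stand in for ASP.NET’s real `Page_Validators` array and `ValidatorValidate` function so the logic can be exercised outside a browser:

```javascript
// A minimal sketch of the loop above as a reusable function.
// allValidatorsValid is a made-up name; in a real page you would pass
// ASP.NET's Page_Validators array and ValidatorValidate function.
function allValidatorsValid(validators, validate) {
    var isValid = true;
    for (var i = 0; i < validators.length; i++) {
        validate(validators[i]);          // ask the validator to (re)evaluate itself
        if (isValid)
            isValid = validators[i].isvalid;
    }
    return isValid;
}

// Stub validators stand in for Page_Validators so this runs anywhere
var stubs = [{ isvalid: true }, { isvalid: false }, { isvalid: true }];
console.log(allValidatorsValid(stubs, function (v) { })); // false
```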


Hope this helps someone…

Canada 2008

Around the first week of June, I was able to relax in Ontario, Canada with my brother, my dad, my dad’s friend Larry, and Molly.  It was nice to get away for a few days and not be concerned about email, TV, and cell phones.  One of the cool things about going up there at this time of year is that it stays bright from 5am to 11pm.  That makes the guilt of taking a nap pretty much nonexistent. 🙂  If you’re interested, I posted a few of the pictures here.

Fix for GroupWise MAPI Session Handle issue

I recently paved a new machine and had to put GroupWise on it.  To make a long story short, the machine got into a state where I would receive a message like "The MAPI Session Handle of the Novell GroupWise ‘Object Request broker’ could not be accessed. The address book cannot run" every time I tried to compose an email.

A lot of searching turned up very little help.  Novell had a craptastic answer regarding this, suggesting that you uninstall everything, including Outlook, and then reinstall.  But this issue is related to the MAPI client not being installed properly.  You don’t have to reinstall GroupWise like other sites suggest; rather, just install the MAPI client from Microsoft.  After I installed the MAPI client, GroupWise works as it should…

The benefits of using Git for source control

I was listening to the Hanselminutes presentation about Git tonight.  Overall, it’s a very cool overview of Git and how it relates to development and existing source control systems.  A situation regarding local development came up during the talk that I’ve personally run into on pretty much every project…

With traditional source control (Subversion, CVS, Vault, SourceSafe, etc.), you check in your modifications when they’re stable enough for other people on your team to consume.  Git adds the notion of a local repository.  So rather than having one central repository that everyone shares, each user has their own personal repository.  You can do all of your checkins/checkouts/modifications/branches against your local repository without affecting other developers on your team.  Then, similarly to traditional source control, you can push your changes to your peers (or a primary peer – think of this as a stable repository on the build server) when things are stable.

To me, this is huge.  I can’t tell you how many times I’ve rewritten something, only to go back to the original method later… or worse, lost changes entirely.  With Git, you can check in your changes anytime you feel like it and not worry about screwing up your coworkers’ experience.  You get the benefits of source control without the headaches of breaking the build for everyone else.
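That local-checkpoint workflow can be sketched in a few commands (a minimal sketch; it assumes the git CLI is installed, and the file and commit names are made up):

```shell
# Local commits as private checkpoints -- nothing here touches a shared repository
cd "$(mktemp -d)"
git init -q
git config user.name "Dev"
git config user.email "dev@example.com"

echo "first pass" > feature.txt
git add feature.txt
git commit -q -m "WIP: first attempt"    # a checkpoint no teammate ever sees

echo "total rewrite" > feature.txt
git commit -q -am "WIP: rewritten"       # safe to experiment; the old version stays in history

git log --oneline                        # both checkpoints exist locally, nothing is pushed
```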

The guys from Planet Argon that Hanselman interviewed have their primary peer hooked up to Subversion.  So they can use Git locally and push their changes to the primary peer, which in turn commits those changes to Subversion.