Get jQuery intellisense in VS.Net when using a CDN

I recently heard about a technique to get jQuery intellisense working in Visual Studio .Net.  jQuery intellisense has never worked properly for me because I don’t use <head runat="server"> and thus don’t link to javascript files the MS way.  Most of the sites I build today reference jQuery from a CDN like Google or Microsoft, and that breaks Visual Studio’s ability to find the associated vsdoc.js file.  This works, however:

Add the following line to your master page, inside the <head> element, directly below your normal jQuery script tag.

<% /* %><script type="text/javascript" src=""></script><% */ %>

That line wraps the script tag in comments, so the script tag never gets rendered on the client side.  Visual Studio sees a valid file, though, and provides intellisense based off the comments in that file.  Below is a full example page layout:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "">
<html xmlns="">
<head>
  <title>Example jQuery Intellisense</title>
  <script type="text/javascript" src=""></script>
  <% /* %><script type="text/javascript" src=""></script><% */ %>

  <!-- Add your own javascript here -->
</head>
<body>
  Do work son
</body>
</html>
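For a concrete illustration, here is what the pair of script tags could look like; the Google CDN URL and the local vsdoc path are hypothetical values for the example, not taken from the original:

```html
<!-- Rendered to the client: jQuery from a CDN (URL is a hypothetical example) -->
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
<!-- Never rendered: the server-side comments hide it from the client, but
     Visual Studio still reads it for intellisense (vsdoc path is a hypothetical example) -->
<% /* %><script type="text/javascript" src="Scripts/jquery-1.3.2-vsdoc.js"></script><% */ %>
```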


I heard about Balsamiq recently and figured I’d pass it along.  It’s a nice tool for quickly creating screen mockups.  After playing with it for about 5-10 minutes this morning, I’ve got to say that I’m impressed.  I definitely think it is a nice alternative to using powerpoint.  It is an Adobe AIR application which means that it’ll run on mac, windows, or linux.  They even allow you to test it out online, at their website.  Check it out!

Disclaimer: This is a bit of an ad, to get a free license, but look through this blog and show me where else I blog about other people’s products.  It’s not very often, so believe me when I say that this one is worth it!

IIS7: How to quickly and easily optimize your website using GZip compression

DmbStream is starting to gain some momentum and I want the site to load as fast as possible. It has over 1,100 registered users now, so every little optimization helps.  I used YSlow to pinpoint some of the major issues with the site and it really shed some light on the bottlenecks.

The first thing I did was use Google to host jQuery. This is an obvious win… The more sites that use Google to host their ajax libraries, the greater the possibility that the user will already have that library in their browser cache. Plus it offloads about 60k of javascript to Google’s CDN for each virgin request.

After that, YSlow said that javascript files were not getting gzip compressed. I have DmbStream hosted with IIS7, so things *should* be easy to configure. After reading this article, I added the following to the <system.webServer> element in my web.config file:

    <staticContent>
      <remove fileExtension=".js" />
      <mimeMap mimeType="text/javascript" fileExtension=".js" />
    </staticContent>

Finally, the html output needed some compression. Once again, IIS7 makes this pretty simple to configure once you find the magic elements to add to the web.config. This article gives a good overview of the elements to add to web.config while this article describes using iis7 dynamic compression with output caching.

For my needs, I just added the following to the web.config <system.webServer> element:

<urlCompression doDynamicCompression="true" />

So what are the results?

Before compression:
Empty browser cache: 123.2K
Primed browser cache: 48.5K

After enabling gzip and dynamic compression:
Empty browser cache: 80.3K
Primed browser cache: 9.5K

That’s a reduction in size of 35-80% per request. Port80 says that these improvements speed the site up 6.1 times. Not too bad for just adding a few lines to the web.config.
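Those percentages follow directly from the before/after numbers above; a quick check:

```shell
# Size reductions implied by the before/after YSlow measurements
awk 'BEGIN {
  printf "empty cache:  %.0f%% smaller\n", (123.2 - 80.3) / 123.2 * 100;
  printf "primed cache: %.0f%% smaller\n", (48.5 - 9.5) / 48.5 * 100;
}'
# prints:
# empty cache:  35% smaller
# primed cache: 80% smaller
```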

I have some other tweaks that I’ll continue playing with (it looks like .gif files aren’t being compressed), but by far the most useful gain came from turning dynamic compression on; in other words, compressing the generated HTML output.

If you’re looking for some more reading material regarding IIS7 compression, I recommend checking out this post as well.

Helper to access route parameters

I had a need to access routing information that was not readily accessible (as far as I could discover).  So, I wrote this helper to allow me to get the string, object pairs that Routing parses from the URL:

using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Routing;

namespace BiaCreations.Helpers
{
    public class RouteHelper
    {
        private static IDictionary<string, object> _values;

        public static IDictionary<string, object> GetRouteInfo(HttpContext context)
        {
            if (_values == null)
            {
                HttpContextBase contextBase = new HttpContextWrapper(context);
                RouteData data = RouteTable.Routes.GetRouteData(contextBase);

                RequestContext requestContext = new RequestContext(contextBase, data);

                _values = requestContext.RouteData.Values;
            }
            return _values;
        }

        public static T GetRouteInfo<T>(HttpContext context, string key)
        {
            IDictionary<string, object> data = GetRouteInfo(context);

            if (!data.ContainsKey(key) || data[key] == null)
                return default(T);

            object objValue = data[key];
            // Route values come through as strings, so convert to the requested type.
            if (typeof(T) == typeof(int))
                objValue = int.Parse(data[key].ToString());
            else if (typeof(T) == typeof(long))
                objValue = long.Parse(data[key].ToString());
            else if (typeof(T) == typeof(Guid))
                objValue = new Guid(data[key].ToString());

            return (T)objValue;
        }
    }
}

There are probably better ways to do this, but I needed this functionality and this works.  I am open to suggestions, though, if you have a better way of accomplishing this.  Oh, and my use case was that I needed the value of the "id" parameter passed to a view, within an asp:substitution callback function.  I know that doesn’t completely follow the MVC philosophy, but you have to work with what you’re given, and sometimes it’s worth bending the rules for the benefits that output caching can provide.

Migrate email from Gmail to Google Apps

I, among others, have searched for a way to transfer the email in my gmail account to my google apps email.  There isn’t a formal way of doing so via Google, but lo and behold I stumbled across a way to do it with Linux!  Consider this an addendum to that post, with complete instructions for those not familiar with linux.  I wanted to keep all of the labels, stars, read status, and email dates.  As an added bonus, this method allows you to change the recipient value on emails so that it shows that it came from “me” rather than your gmail address.  I used Amazon EC2 to work the magic for me, and 46k emails later, I’m a happy google apps user 🙂  You can just as easily use your own linux box instead.

This is how you can transfer your email from Gmail to your Google Apps email:

  1. Log into Amazon EC2 and select a Fedora instance.  It doesn’t really matter which instance you use.  I used “Basic Fedora Core 8 (AMI ID: ami-5647a33f)”
  2. Follow the example video on Amazon’s website for how to SSH into your instance
  3. Log in as root
  4. Install imapsync by running “yum install imapsync”
  5. Edit a script by running “nano run-imapsync”
  6. Paste in the following:
    imapsync --host1
    --port1 993 --user1
    --passfile1 ./passfile1 --ssl1
    --port2 993 --user2
    --passfile2 ./passfile2 --ssl2
    --syncinternaldates --split1 100 --split2 100
    --authmech1 LOGIN --authmech2 LOGIN

    imapsync --host1
    --port1 993 --user1
    --passfile1 ./passfile1 --ssl1
    --port2 993 --user2
    --passfile2 ./passfile2 --ssl2
    --syncinternaldates --split1 100 --split2 100
    --authmech1 LOGIN --authmech2 LOGIN
    --regexmess 's/Delivered-To:'
    --regexmess 's/<>/<>/g'
    --regexmess 's/Subject:(\s*)\n/Subject: (no-subject)$1\n/g'
    --regexmess 's/Subject: ([Rr][Ee]):(\s*)\n/Subject: $1: (no-subject)$2\n/g'

    Replace the --user1 value with your Gmail address and the --user2 value with your Google Apps email address

  7. Press Control-x to save the file and quit nano
  8. Make the script executable by running “chmod 744 run-imapsync”
  9. Create a file containing your Gmail password by running “nano passfile1”
  10. Type in your Gmail password and press Control-x to save the file
  11. Create a file containing your Google Apps password by running “nano passfile2”
  12. Type in your Google Apps password and press Control-x to save the file
  13. Execute the script by typing “./run-imapsync”

Depending on the size of your mailbox, you’ll have nirvana in a few hours 🙂  Transferring my 46k emails, weighing in around 2.5Gb, took roughly a day… I had to babysit the process because it failed after a while for some unknown reason, but restarting it with an appropriate --maxage param will get you right back near where you left off.  You may notice that I call imapsync twice in my script file.  It was failing on messages that had multiple labels because the folders weren’t created yet, so the first call creates all of the folders while the second call moves all of the messages.

Asp.Net MVC: Support both static and dynamic views

Let’s try this again… the first time I wrote this post, my computer restarted and I lost it entirely 🙁

I have created a handful of sites that require both "static" pages and dynamic pages.  I put static in quotes because there is a dedicated page with unique content that can’t be created with a wysiwyg editor.  For example, show a list of the last 10 visitor ip addresses.  In the past, I’ve used a combination of webforms and isapi_rewrite rules to accomplish what I needed.  I’ve felt that the isapi_rewrite part was a little flaky, but it worked and isapi_rewrite is extremely fast.  Anyway, I’ve been using the Asp.Net MVC bits since preview 1 and have become fairly happy using it as an alternative to webforms.  I wanted to see how hard it was to allow for a situation like this with mvc.  Turns out, it’s pretty damn easy, and fairly clean (maintenance wise).

I tried a few approaches before settling on this solution.  There may be other ways to accomplish the same result, but the following works.  I’m not going to publish all of the code, but I’ll go through the core concepts:

First, this method only supports single hierarchy url depths.  Meaning, the dynamic pages can have urls like /MyUrl, /my-url, or /this-is_valid.  They cannot (currently) have urls like: /my/url, blog/2008/my-blog-title.  Again, I doubt it would be hard to support cases like that, but I wanted to keep things simple for this explanation.  Onto the setup:

The controller setup is as follows:

  • HomeController
    • Index action – Consider this a "static" page with some custom content on it.  It has a view associated with it in the Views directory.
    • InvalidPage action – Used to show the 404 error
  • DynamicPageController
    • Index action – Only action and it takes a string parameter named id.  Id stands for the friendlyName of the dynamic page to show.  This action grabs the page information from the db and sends it to its related view, that just acts as a template for all of the dynamic pages.  Side note: I left the parameter named id so that I wouldn’t have to add an extra route to accommodate a more appropriate friendlyName parameter.  Minor, and largely irrelevant.

In order to have MVC attempt to locate dynamic pages, I had to create my own controller factory class.  I wanted the original controller-lookup logic, but if a controller was not found, I wanted to send the request to the DynamicPageController Index action.  This allows the "static" pages to take precedence and then fall back to the dynamic page content.  The following code does just that:

public class BiaControllerFactory : DefaultControllerFactory
{
    public override IController CreateController(System.Web.Routing.RequestContext requestContext, string controllerName)
    {
        try
        {
            return base.CreateController(requestContext, controllerName);
        }
        catch (Exception)
        {
            // Controller not found: treat the "controller" segment as a dynamic page name.
            requestContext.RouteData.Values["id"] = controllerName;

            controllerName = "DynamicPage";
            requestContext.RouteData.Values["controller"] = controllerName;

            return base.CreateController(requestContext, controllerName);
        }
    }
}
It’s pretty straightforward.  The base.CreateController call will throw an exception when it can’t find the specified controller.  Since this example only allows for dynamic urls like /my-dynamic-page, the default routes interpret that as a my-dynamic-page controller, index action.  Easy enough to set the correct RouteData values and send the logic to the DynamicPageController Index action, with the correctly specified id parameter.

If the DynamicPageController Index action doesn’t find the dynamic page specified by the id parameter, it’ll redirect to the HomeController InvalidPage action.  As a bonus, I setup the InvalidPage action to send the proper 404 response code back to the client.

To have MVC use the new controller factory class, add the following in the same place that you define your routes (Usually Global.asax):

ControllerBuilder.Current.SetControllerFactory(new BiaControllerFactory());

That’s basically it… I plan to use this method for smaller sites that require some custom coded pages as well as simple CMS capabilities.  Let me know if you have any questions, comments, or suggestions related to this.

Vino 100

Over the weekend, I had the pleasure of going to Vino 100.  It’s a small wine shop in Wauwatosa, WI (they also have one in the 3rd Ward).  I should begin with a bit of background on my wine knowledge, though.  While I’m not a huge wine drinker, I do enjoy a glass now and then.  Half-priced wino wednesdays have really brought out my inner wino and I’ve been drinking wine more frequently lately.  I just don’t have a lot of knowledge when it comes to relating wine names with flavors.

Vino 100 takes a lot of the unknowns out of choosing wine.  The store is laid out with red wines along one wall and whites along another, with tasting tables and a small bar occupying the space between walls.  The genius is how they’ve organized the wines, though.  The wines are organized by taste rather than name.  For the white wine wall, they start with fruity, sweet wines on the left and gradually move to dry, full bodied wines on the right.  Each wine has a description and a visual representation for the wine’s flavor and body.  If the wine is on their menu, you can even sample the wine before making a purchase.  The red wine wall is organized in much the same way.

For me, they really hit it out of the park.  Instead of catering to the wine connoisseur, they cater to novices like myself and make it easy to improve your wine knowledge without running into an unexpected flavor.

Updated SubSonic 3 Templates (Version 1)

As I mentioned in my previous post, I am playing around with SubSonic 3.  While the templates that come with it are a good starting point, I made some changes to make things feel a little more like the SubSonic I know and love…

You can download my updated SubSonic templates here.

What have I changed?

  1. I added a template called that contains settings used across all templates
  2. You can now specify a namespace that is different from the db provider name
  3. I added StripTableText and StripSPText, allowing you to remove table & sproc prefixes
  4. I moved ExcludeTables to a central location
  5. I removed redundant assembly and import directives
  6. Pluralized Queries in
  7. Public fields were converted to automatic properties

This won’t be the last update to the templates, but I think it’s a step in the right direction…

Visual Studio Website Project: Add context menu for T4 files

Hey all,

I was excited to get going with SubSonic 3, but soon realized that website projects do not support T4 (Text Template Transformation Toolkit) files.  I found a command-line tool that you can use to generate the output, but I wanted something even easier, so I set out to create a right-click context menu item to generate the code files.  Here is one way you can do it.

  1. Add an external tool command to the command-line tool

    1. Click Tools -> External Tools…
    2. Click Add

      Title: Generate T4 Code
      Command: C:\Program Files\Common Files\Microsoft Shared\TextTemplating\1.2\TextTransform.exe
      Arguments: $(ItemFileName)$(ItemExt) -out $(ItemFileName).cs
      Initial Directory: $(ItemDir)

    3.  Check "Use Output Window" and click Ok
  2. Add it to the website project context menu
    1. Click View -> Toolbars -> Customize…
    2. Check Context Menus

    3.  With the Customize dialog still open, click the Tools menu
    4. While holding Ctrl, left-click and drag your External Tool command to the Project and Solution Context Menus toolbar; the toolbar opens, allowing you to continue dragging the command onto Web Item

    5. Click the Project and Solution Context Menus toolbar, Click Web Item, and Right Click External Command
    6. Change the Name to Generate T4 Code and press Enter

    7. Close the Customize dialog
  3. Now you have a right click context menu to generate code for .tt files

There is an annoyance with this method, however.  The context menu item will show up regardless of file type/extension.  If someone knows how to fix this or has a better way of adding context menu items, I’m all ears.  I did find this article on how to add context menu items with VSIP, but that seemed a bit overkill…