Enhanced the Graffiti Extras project

I came across the Graffiti Extras project today and decided to modify the sharing portion of it. I added DotNetKicks to the list of social apps and made a few other minor updates as well.

By far, the trickiest part for me was figuring out how to actually use the extension. Other than that, everything went smoothly, and you can see I'm currently using it on this site.

To use the extension, just add the following to your post.view (or whichever view you want). Note: ~/__utility/img/sharing is the directory where I placed the sharing images.

$sharing.Write($post, "~/__utility/img/sharing")

To show only DotNetKicks and, say, Digg, you can do so with:

$sharing.Write($post, "~/__utility/img/sharing", " ", "DotNetKicks,Digg It!")


Hope this helps someone.

VisualSVN Server

Wow… well done!  I just installed VisualSVN Server and man… the team behind it did an excellent job.  If you’re looking into setting up a Subversion server in a Windows environment, but have held off because it involves Apache and a service traditionally installed in a *nix environment, look at VisualSVN Server. 

VisualSVN Server seamlessly installs and configures Apache and Subversion for you. It also provides a handy GUI for managing security and repository information. The installer makes running on a port other than 80 as easy as typing in the desired port number, so there's no worry if you want to keep IIS on port 80 and run Apache elsewhere.

Upgraded to Graffiti

I’ve been working on moving a few of my websites to the Graffiti platform for a little while. Migrating the old Bia Creations site from WordPress to Graffiti was straightforward and worked flawlessly. By far, the most involved migration was this site, since a couple hundred posts needed to be imported from the Community Server database into the new Graffiti database. One unfortunate side effect is that some of the later comments were not imported.

I ended up writing my own migrator because I had performance and connection issues with the one that ships with Graffiti. I pruned out posts that I felt didn’t add much to the site and created a rather large redirect script to ease the transition to the new URLs. If anyone else is moving from Community Server to Graffiti and is interested in how I generated the isapi_rewrite rules for the redirects, I can share it. It’s not pretty, but it worked great for me.
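If you're curious what that rule generation might look like, here's a minimal sketch (not my actual script; the paths and output file name are made up): map each old Community Server URL to its new Graffiti URL and emit one 301 rule per post in isapi_rewrite syntax.

using System;
using System.Collections.Generic;
using System.IO;

class RedirectRuleGenerator
{
    static void Main()
    {
        // Hypothetical old-URL -> new-URL pairs; mine came from the two databases.
        Dictionary<string, string> map = new Dictionary<string, string>();
        map.Add("blogs/main/archive/2007/05/10/some-old-post.aspx", "/post/some-old-post/");
        // ... one entry per migrated post ...

        using (StreamWriter writer = new StreamWriter("redirects.htaccess"))
        {
            foreach (KeyValuePair<string, string> pair in map)
            {
                // Emit a permanent (301) redirect in isapi_rewrite 3 syntax.
                writer.WriteLine("RewriteRule ^{0}$ {1} [NC,R=301,L]",
                    pair.Key.Replace(".", @"\."), pair.Value);
            }
        }
    }
}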

My experience so far has been extremely pleasant… I’m using commercial versions of Graffiti, so being able to use the integrated reporting is exciting to me.  Yeah, I’ve always had Google Analytics, but it’s nice to have the reporting built right into the admin section.  Overall, I’m happy with the themes for both this site and the Bia Creations site.  Graffiti makes it really easy to create a basic theme, which is more than I can say for Community Server 🙂

Other than that, as a complement to the website upgrade, the web server got a nice upgrade as well. The hardware was upgraded, adding much-needed redundancy, and should provide a more stable environment for this website. The OS was upgraded to Windows Server 2008, and the database to SQL Server 2005. MS did an excellent job with the latest server OS, and I’m considering using it as my primary development environment instead of Vista.

Please let me know what you think and if you run into any issues.

Licensing component/library for dotnet

I’ve been quite busy with Property Center updates, another project that isn’t quite mature enough to be introduced, and consulting. The unfortunate side effect is that my blog writing suffers.

I have a question for you, though. I am looking for a good .NET licensing component. If possible, I’d like to encrypt some information, like the customer’s name and software plan, into the license key. I am fine with emailing a GUID to the person who bought the license, having them enter that GUID into the program, and having the program fetch the actual license file over the internet. If it matters, the program will be an ASP.NET application.
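Something along these lines is what I'm picturing, as a rough sketch rather than any specific product: the app takes the GUID, downloads a license file, and verifies an RSA signature over the embedded data. The URL, payload format, and key below are placeholders.

using System;
using System.Net;
using System.Security.Cryptography;
using System.Text;

public static class LicenseChecker
{
    // Downloads "licenseData|base64(signature)" for the given key and verifies
    // the RSA signature over the data with a public key shipped in the app.
    public static bool TryLoadLicense(Guid licenseKey, out string licenseData)
    {
        using (WebClient client = new WebClient())
        {
            // Hypothetical endpoint -- whatever the licensing component provides.
            string raw = client.DownloadString("https://example.com/licenses/" + licenseKey);

            int split = raw.LastIndexOf('|');
            licenseData = raw.Substring(0, split);
            byte[] signature = Convert.FromBase64String(raw.Substring(split + 1));

            using (RSACryptoServiceProvider rsa = new RSACryptoServiceProvider())
            {
                rsa.FromXmlString("<RSAKeyValue>...</RSAKeyValue>"); // placeholder public key
                return rsa.VerifyData(Encoding.UTF8.GetBytes(licenseData),
                    new SHA1CryptoServiceProvider(), signature);
            }
        }
    }
}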

If you have any suggestions for such a library, I’m all ears.  Please email me or add a comment to this post. 

How to enable pretty URLs with ASP.NET MVC and IIS6

I’ve been working with the ASP.NET MVC bits lately, and everything had been going smoothly. That is, until I tried to deploy to a Windows 2003 server. All of a sudden, my pretty URLs looked like crap. When it comes to deploying an ASP.NET MVC app to IIS6, you have two options:

  1. Set up a wildcard mapping, or
  2. Add an ISAPI mapping and have an extension (.mvc) use the ASP.NET runtime.

Both techniques have their good and bad points. The wildcard mapping gives you nice-looking URLs, but all requests (including images, CSS, scripts, etc.) go through the ASP.NET pipeline. If performance is a concern, and generally it should be, you’ll want to avoid needlessly routing every request through the ASP.NET pipeline. Alternatively, mapping just an extension to the ASP.NET runtime gives you the best performance, but the URLs are less desirable (e.g. http://localhost/Home.mvc/About).

My solution is to use Isapi_rewrite to transparently rewrite the nice URLs into *.mvc requests. The user sees http://localhost/Home/About, while the web server (after the Isapi_rewrite filter has been applied) sees http://localhost/Home.mvc/About. Isapi_rewrite is fast (written in C/C++), so you don’t have to worry about performance, and your static files (scripts, images, etc.) never touch the pipeline. Best of all, your users (and Google) see pretty URLs. It’s a win-win setup for IIS6.

On to the good part…

  1. Set up your .mvc extension map in IIS
    1. Open the IIS Management Console
    2. Expand your computer
    3. Expand Websites
    4. Right-click the website you’d like to edit (Most times it’ll be called "Default Web Site") and click Properties
    5. Click the Home Directory tab
    6. Click Configuration…
    7. Click Add
    8. Executable: c:\windows\microsoft.net\framework\v2.0.50727\aspnet_isapi.dll
    9. Extension: .mvc
    10. Verbs: Limit to: GET,HEAD,POST,DEBUG
    11. Un-check Verify that file exists
    12. Click Ok
  2. Install isapi_rewrite
  3. Put an .htaccess file in your website directory and edit as follows (make note of how I ignore the Content directory):
    RewriteEngine on
    RewriteRule ^Home(/)?$ $9 [NC,R=301]
    RewriteRule ^$ Home [NC]
    RewriteRule ^([\w]+)$ $1.mvc [NC]
    RewriteRule ^(?!Content)([\w]*)/(.*) $1.mvc/$2 [NC]
    
  4. Modify your routes to include the routes with and without the .mvc extension.  My routes look like:
    routes.IgnoreRoute("{resource}.axd/{*pathInfo}");
    routes.MapRoute(
    "Default",                                              // Route name
    "{controller}/{action}/{id}",                           // URL with parameters
    new { controller = "Home", action = "Index", id = "" }, // Parameter defaults
    new { controller = @"[^.]*" }                          // Parameter constraints - Do not allow dots in the controller name
    );
    routes.MapRoute(
    "Default.mvc",                                          // Route name
    "{controller}.mvc/{action}/{id}",                       // URL with parameters
    new { controller = "Home", action = "Index", id = "" }, // Parameter defaults
    new { controller = @"[^.]*" }                          // Parameter constraints - Do not allow dots in the controller name
    );
    

Defining both routes, with and without the .mvc extension, gives you an added bonus. The built-in helpers for generating URLs use the first matching route, so they will not generate URLs with the .mvc extension. The MVC framework still recognizes and processes both forms of URL, though. So defining the routes this way preserves debugging with Cassini as well as the IIS6 deployment scenario.

One other thing to note: the "RewriteRule ^Home(/)?$ $9 [NC,R=301]" line in the .htaccess file redirects requests for /Home and /Home/ back to the base URL. "RewriteRule ^$ Home [NC]" then rewrites requests for the base URL to the Home controller (transparently to the user). The final two lines transparently map controller URLs to their .mvc equivalents, allowing the request to be processed by ASP.NET (via the .mvc extension mapping set up in step 1).

Everything seems to be working great.  Hope this helps anyone in the same position.

** Updated for Preview 4

Cache Util and ServiceBase for SubSonic RepositoryRecord items

I have started working on a project in my “free” time and want to share some of the things I’ve been playing with. For the new project, I’m planning on using MVC, SubSonic, LINQ, and jQuery, to name a few technologies. I’ll release more info about the project as it shapes into something more tangible.

As I mentioned, I’m using SubSonic with this project and more specifically, I’ve been using what will be SubSonic 2.1.  I’ve been quite happy with the work Rob Conery and his team have done with SubSonic.  For this project, I chose to go with the RepositoryRecord base class rather than ActiveRecord for my objects.  My reasoning was that I end up using services to interact with the objects anyway, so I might as well reduce the “weight” of the objects.  The services that I use add basic object caching as well as hide SubSonic integration.

In order to facilitate caching RepositoryRecord items, I had to rewrite my CacheUtil class slightly:

public class CacheUtil
{
    public static List<T> FetchAll<T>() where T : RepositoryRecord<T>, new()
    {
        string key = typeof(T).ToString();
        object item = HttpContext.Current.Cache[key];

        if (item == null)
        {
            List<T> collection = DB.Select()
                .From<T>()
                .ExecuteTypedList<T>();

            HttpContext.Current.Cache.Insert(key, collection);
            return collection;
        }
        return (List<T>)item;
    }

    public static List<T> FetchAllAsc<T>(params string[] columns) where T : RepositoryRecord<T>, new()
    {
        string key = typeof(T).ToString() + "__SortAsc";
        foreach (string column in columns)
        {
            key += "_" + column;
        }
        object item = HttpContext.Current.Cache[key];

        if (item == null)
        {
            List<T> collection = DB.Select()
                .From<T>()
                .OrderAsc(columns)
                .ExecuteTypedList<T>();

            HttpContext.Current.Cache.Insert(key, collection);
            return collection;
        }
        return (List<T>)item;
    }

    public static List<T> FetchAllDesc<T>(params string[] columns) where T : RepositoryRecord<T>, new()
    {
        string key = typeof(T).ToString() + "__SortDesc";
        foreach (string column in columns)
        {
            key += "_" + column;
        }
        object item = HttpContext.Current.Cache[key];

        if (item == null)
        {
            List<T> collection = DB.Select()
                .From<T>()
                .OrderDesc(columns)
                .ExecuteTypedList<T>();

            HttpContext.Current.Cache.Insert(key, collection);
            return collection;
        }
        return (List<T>)item;
    }

    public static void ClearCache<T>()
    {
        string baseCacheKey = typeof(T).ToString();
        if (HttpContext.Current.Cache[baseCacheKey] != null)
        {
            HttpContext.Current.Cache.Remove(baseCacheKey);
        }

        RemoveWithWildcards(baseCacheKey + "__SortAsc*");
        RemoveWithWildcards(baseCacheKey + "__SortDesc*");
    }

    /// <summary>
    /// Removes items from the cache using wildcards * and ?
    /// </summary>
    /// <param name="pattern">Pattern to search cache keys</param>
    public static void RemoveWithWildcards(string pattern)
    {
        pattern = pattern.Replace("*", @"\w*").Replace("?", @"\w");
        // Only match whole words
        pattern = string.Format("{0}{1}{0}", @"\b", pattern);
        RemoveByPattern(pattern);
    }

    /// <summary>
    /// Removes items from the cache based on the specified regular expression pattern
    /// </summary>
    /// <param name="pattern">Regular expression pattern to search cache keys</param>
    public static void RemoveByPattern(string pattern)
    {
        IDictionaryEnumerator enumerator = HttpContext.Current.Cache.GetEnumerator();
        Regex regex = new Regex(pattern, RegexOptions.Singleline | RegexOptions.Compiled | RegexOptions.IgnoreCase);
        while (enumerator.MoveNext())
        {
            if (regex.IsMatch(enumerator.Key.ToString()))
            {
                HttpContext.Current.Cache.Remove(enumerator.Key.ToString());
            }
        }
    }
}


I’m using the following base class for my services.  While it’s likely to change, I think it’s a pretty good place to start.

public abstract class ServiceBase<T> where T : RepositoryRecord<T>, new()
{
    public static void Save(T item)
    {
        Save(item, "");
    }

    public static void Save(T item, string username)
    {
        DB.Save(item, username);
        ClearCache();
    }

    public static void Delete(T item)
    {
        DB.Delete(item);
        ClearCache();
    }

    public static void Destroy(T item)
    {
        DB.Destroy(item);
        ClearCache();
    }

    public static T FetchById(object id)
    {
        List<T> items = FetchAll();
        return items.Find(delegate(T item) { return item.GetPrimaryKeyValue().Equals(id); });
    }

    public static List<T> FetchAll()
    {
        return CacheUtil.FetchAll<T>();
    }

    public static List<T> FetchAllAsc(params string[] columns)
    {
        return CacheUtil.FetchAllAsc<T>(columns);
    }

    public static List<T> FetchAllDesc(params string[] columns)
    {
        return CacheUtil.FetchAllDesc<T>(columns);
    }

    public static void ClearCache()
    {
        CacheUtil.ClearCache<T>();
    }
}
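With that base in place, a concrete service needs almost no code of its own. As a rough illustration (Product is just a stand-in for one of the generated RepositoryRecord classes):

public class ProductService : ServiceBase<Product> { }

// Typical usage from a page or controller:
// List<Product> products = ProductService.FetchAllAsc("Name");
// Product product = ProductService.FetchById(42);
// ProductService.Save(product, User.Identity.Name);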


Goodbye Vault, Hello Subversion

After a few years of using Vault, I have finished moving all of my internal projects to Subversion. Since I regularly work from multiple computers/locations and will most likely be adding people in the future, I wanted a free, flexible solution for source control. Subversion was a breeze to set up, and I’m very happy that it isn’t integrated with Visual Studio. I much prefer using TortoiseSVN to manage my projects. No longer will I have to wait for Visual Studio to connect to a repository when Internet access isn’t available. It just makes life a little easier…

I know that I could’ve used Vault without the Visual Studio integration, but that’s how it set itself up, and I generally take the path of least resistance. Truth be told… I had numerous instances where Vault failed to overwrite files with the latest from the repository. Also, I know that Subversion has a companion called VisualSVN that lets you use it inside Visual Studio, but I’ve realized that I’m not a big fan of integrating source control with the IDE. You may be different, but that’s how I feel…

Anyway, I welcome the change to Subversion and I hope that it serves me as well as Vault has.   

VS.Net 2008 “Command Prompt Here” registry key

To complement a previous post for Visual Studio 2003 and 2005, this registry key will add a context menu item to Windows Explorer. The menu item opens a command prompt with vsvars32.bat from Visual Studio 2008.


Windows Registry Editor Version 5.00

[HKEY_CLASSES_ROOT\Directory\shell\Visual Studio .NET 2008 Command Prompt]

[HKEY_CLASSES_ROOT\Directory\shell\Visual Studio .NET 2008 Command Prompt\command]
@="cmd.exe %1 /K \"C:\\Program Files\\Microsoft Visual Studio 9.0\\Common7\\Tools\\vsvars32.bat\""

New website design for Property Center

I updated the Property Center website this weekend. I worked with the Action Finale guys, and they did an awesome job. The result is super clean XHTML markup and CSS. Hell, even the print view looks good. Take a look at the new design and let me know what you think.

One minor change I made was to use MooTools instead of Prototype. While I do like Prototype, I couldn’t justify the 70k+ download for the functionality I wanted to use. I was able to use a slightly modified Slimbox for the tour previews. Moo, Slimbox, and one other script come in at less than 40k combined, which is fine for the few pages that require them. I am still using Prototype for the software portion of Property Center.

Does Google make you less skilled at problem solving?

Over the last few years, I’ve used Google to help me solve all sorts of problems, which means I haven’t been figuring them out on my own. I’m not saying that I can’t solve problems, but Google is usually the first place I turn for a solution before taking the time to work through it myself.

I feel that my problem-solving skills haven’t been exercised nearly as much as my ability to coax relevant results out of the search engine. Think about it… how many times do you turn to Google to answer a question that’s on your mind? Rather than figuring it out, Google provides the answer with little thought on your part. When it comes time to think for yourself, you’re less prepared to solve the problem than if you had been practicing your problem-solving skills all along. So while I don’t dismiss the immediate benefits of using Google for answers, I do think we grow lazier as its index becomes richer.

This correlates with how cellphones have changed us as well. It might date me to say this, but in high school (and part of college) I had memorized the phone number of everyone I knew directly. Now, I doubt I could readily tell you the cell phone numbers of close family members off the top of my head.

While I realize that Google (and the cellphone phone book) are tools for solving problems, it feels like cheating to me. This probably all boils down to evolution. Let’s just hope the tools we use on a daily basis are factual. If not, hopefully we can discern which results are accurate with our modern tools.