Tuesday, April 29, 2008

Running Multiple .NET Services within a Single Process

I love the fact that .NET makes it profoundly easy to write Windows Services. Most of the low-level details have been abstracted away, and while this makes it easier to write and deploy services, sometimes it doesn't work the way you'd expect. For example, I noticed something odd when I tried to write a service that hosted multiple services. According to the API, it is possible to provide multiple ServiceBase objects to the static Run method as an array:

public static void Main()
{
 ServiceBase[] ServicesToRun = new ServiceBase[] { new Service1(), new Service2() };
 ServiceBase.Run(ServicesToRun);
}

However, when my code executes, only the first ServiceBase object runs, which seems suspicious. The culprit is that the API is somewhat misleading -- the ServiceBase.Run method is not actually responsible for running your services. Instead, it loads them into memory and hands them off to the Service Control Manager; only the service you request from the Services applet or the command line gets activated:

NET START Service1

This issue has appeared in many different forums, but no one seems to post a working example, so maybe I'm not entirely alone on this one. I think part of the confusion stems from the fact that I can give the first ServiceBase object in the array any ServiceName I wish and it will execute.

public static void Main()
{
 ServiceBase myService = new Service1();
 myService.ServiceName = "ServiceA";
 ServiceBase.Run(myService);
}

How to make it work:

The correct way to allow multiple services to run within a single process requires the following:

  1. An installer class with the RunInstaller attribute set to true. The class is instantiated and invoked when you run InstallUtil.exe
  2. The installer class must contain one ServiceProcessInstaller instance. This object defines the account that your service process will run under.
  3. The installer class must contain one ServiceInstaller instance per ServiceBase in your application. If you plan on running multiple services, each one must (sadly) be installed prior to use.
  4. For the service that you anticipate being started from the Services Applet, list the other services in the ServicesDependedOn property so that they will be started when your service starts:
[RunInstaller(true)]
public class MyServiceInstaller : Installer
{
 public MyServiceInstaller()
 {
     ServiceProcessInstaller processInstaller = new ServiceProcessInstaller();
     processInstaller.Account = ServiceAccount.LocalSystem;
  
     ServiceInstaller mainServiceInstaller = new ServiceInstaller();
     mainServiceInstaller.ServiceName = "Service1";
     mainServiceInstaller.Description = "Service One";
     mainServiceInstaller.ServicesDependedOn = new string [] { "Service2" };
  
     ServiceInstaller secondServiceInstaller = new ServiceInstaller();
     secondServiceInstaller.ServiceName = "Service2";
     secondServiceInstaller.Description = "Service Two";
  
     Installers.Add(processInstaller);
     Installers.Add(mainServiceInstaller);
     Installers.Add(secondServiceInstaller);
 }
}

Now when Service1 starts, Service2 is also started. Happily, both services log to the same log4net file and the number of Processes in the Task Manager increments only by one.

Note that when Service2 is stopped, Service1 will also be stopped. However, stopping Service1 will not stop Service2. If you want tighter coupling between the two services, you might consider adding ServiceController logic to Service1 to start and stop Service2 during the Service1 OnStart and OnStop methods... maybe something I'll follow up on in a later post.
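As a teaser for that follow-up, here's a minimal sketch of what that coupling might look like. This is an assumption-laden example (error handling, time-outs, and the rest of Service1 are omitted; "Service2" is assumed to match the installed service name):

```csharp
using System.ServiceProcess;

public partial class Service1 : ServiceBase
{
    protected override void OnStart(string[] args)
    {
        // start the sibling service if it isn't already running
        using (ServiceController controller = new ServiceController("Service2"))
        {
            if (controller.Status == ServiceControllerStatus.Stopped)
            {
                controller.Start();
                controller.WaitForStatus(ServiceControllerStatus.Running);
            }
        }
    }

    protected override void OnStop()
    {
        // bring the sibling service down along with this one
        using (ServiceController controller = new ServiceController("Service2"))
        {
            if (controller.Status == ServiceControllerStatus.Running)
            {
                controller.Stop();
            }
        }
    }
}
```

Note that the service account would need permission to control other services for this to work.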

submit to reddit

Tuesday, April 22, 2008

ASP.NET Uri Fragment is not available

Recently, a question came my way about filtering URLs that contain fragment-identifiers. A fragment-identifier goes by many different names (bookmark, pound, hash, named-anchor, etc.) and follows a pound symbol (#) at the end of the path and querystring:

http://server/path?query#fragment-identifier

Unfortunately, I had looked into something similar only a few months previously, so my response came immediately: "this cannot be done." While researching a problem several months ago, I was surprised to learn that the fragment of the URL is a client-side-only construct, meaning that most modern browsers use it primarily to scroll the named element into view -- they do not transmit this information to the web server. A simple test shows this value is NEVER populated.

public partial class PageTest : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        Response.Write("Fragment = " + Request.Url.Fragment + "<br />");
    }
}

Sadly, this is not ASP.NET specific. It's part of the Uri specification. A Wikipedia article on this topic suggests that you can pass "#" to the server if it is encoded as %23, although this value is treated as part of the querystring instead of being interpreted as the Uri fragment.

If you need these values in the URL, put them in the query-string.
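To see the distinction in isolation, here's a quick console sketch using the Uri class (the server and parameter names are made up). It shows that the fragment only exists where the full URL string is parsed, and that an encoded %23 is treated as plain query-string data:

```csharp
using System;

class FragmentDemo
{
    static void Main()
    {
        // The Uri class happily parses a fragment on the client side...
        Uri url = new Uri("http://server/path?query=1#section2");
        Console.WriteLine(url.Fragment); // #section2
        Console.WriteLine(url.Query);    // ?query=1

        // ...but an encoded %23 is just query-string data, not a fragment.
        Uri encoded = new Uri("http://server/path?anchor=%23section2");
        Console.WriteLine(encoded.Fragment); // (empty)
        Console.WriteLine(encoded.Query);    // ?anchor=%23section2
    }
}
```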

submit to reddit

Friday, April 18, 2008

Visual Studio "Format the whole document" for Notepad++ using Tidy

I've started playing with Notepad++ over the last year, and I'm really liking it. If you've been living under a rock, it's an open-source replacement for the boring Windows notepad.exe and has appeared on top-ten lists, including Scott Hanselman's Ultimate Developer and Power User Tool List. While I haven't completely replaced Visual Studio, I have found a few neat tricks that have saved me a lot of grief.

My recent favorite is Tidy, an open-source tool that formats HTML output, which is included as a TextFX plugin for Notepad++. By default, it doesn't do much, but the magic starts when you drop a configuration file into the TextFX plugin folder. Here's how I've configured mine:

  1. Navigate to C:\Program Files\Notepad++\plugins\NPPTextFX
  2. Create a text file named htmltidy.cfg and place the following contents inside:
    indent: auto
    indent-spaces: 2
    wrap: 72
    markup: yes
    input-xml: yes
  3. Enjoy!

The configuration above is a basic format, which automatically wraps and indents XML/XHTML files nicely. To use it, just load up your XML file, choose "TextFX -> TextFX HTML Tidy -> Tidy" and your document should automatically indent properly. If you need more options, check out the Tidy quick reference guide. If you format a lot of XML documents, you can speed things up by assigning a shortcut key:

  1. Choose "Settings -> Shortcut Mapper"
  2. Click on the "Plugin Commands" and scroll down to entry "D:Tidy" (entry 241 on my system).
  3. Double click the item and assign a ShortCut key.

A quick aside on shortcut keys: I had to try a few different options until Tidy formatted my document. I suspect that Notepad++ doesn't detect duplicate shortcuts. I settled on CTRL+ALT+K, which seems to work without issue.

Lastly, if you want to completely replace "notepad.exe" with "Notepad++", there's a neat replacement utility referenced on the Notepad++ site that you should download and follow their basic instructions. Note that this utility is not the same as renaming notepad++.exe to notepad.exe and dropping it in your Windows directory; it's a utility that looks up the location of notepad++.exe from the registry and forwards requests to it. Also note, if your machine shipped with a copy of the operating system (typically an i386 folder), you need to replace the original notepad.exe there as well.

submit to reddit

Thursday, April 17, 2008

Selenium 0.92 doesn't work in IE over VPN or Dialup

I've been writing user interface tests for my current web project using Selenium. I really dig the fact that it uses JavaScript to manipulate the browser. I'm working on building a Language Pattern, where my unit tests read like a simple domain language -- it involves distilling Selenese output into a set of reusable classes.

I ran into a really frustrating snag during a late-night coding session and I started to freak out a bit -- my Selenium tests just magically stopped working! Instead of getting the Selenium framed window, my site was serving 404 messages. At first I thought the plumbing code that I had written was somehow serving the wrong URL.

I quickly switched my tests to FireFox and was relieved to see them working fine -- and under the same URL. Since my client uses IE 6, dropping Internet Explorer support for UI tests would be a deal breaker. I was surprised to see the tests work when I switched the URL from localhost:80 to localhost:4444, which is the port Selenium's proxy server runs on. The light in my head started to glow...

The aha moment came when I switched back to FireFox: I noticed that none of my FireFox plugins were loaded and that the proxy server setting had been enabled to route localhost:80 through localhost:4444. Selenium controls the browser's proxy settings through the registry, which meant some setting had to be missing in IE. Although Internet Explorer had been configured to use my Selenium proxy-server settings, it ignores those values on dial-up and VPN connections -- you need to specify a different proxy server through an Advanced settings menu.

Both FireFox and Internet Explorer use PAC files, which automatically detect the configuration settings for your proxy server. Selenium generates a new PAC file between executions, so you'll quickly find that manually fixing it becomes a pain. To fix this, create your own PAC file and wire the setting in yourself.

Here's a snap of my connections dialog:

And the contents of my selenium-proxy.pac file:

function FindProxyForURL(url, host) {
        return 'PROXY localhost:4444; DIRECT';
}

submit to reddit

Tuesday, April 15, 2008

Danger: Visitor Design Pattern can be useful

It seems that in my circles, out of all the design patterns in the Gang of Four, the Visitor pattern is often seen as confusing and impractical. I'd agree with that assessment: patterns like the Command, Strategy, Composite, and Factory are commonly used because it's easy to think of examples that work, whereas the Visitor pattern has a confusing relationship between objects and requires a lot of upfront code to make it work. It's easily filed under the i-don't-think-i'll-ever-use-this category.

I recently found a great code example on Haibo Luo's blog that involved using reflection to read IL (using Method.GetMethodBodyAsIL()). In it, he posts two very different approaches to parsing the IL: the first post shows a Reflector-like example of an IL-Reader; the second post is focused on a related side-project but outlines how he was able to use the Visitor pattern to allow different interpretations of the IL. The Visitor pattern is perfect here, because IL is based on a fixed specification that will never change. (A side note: the entire ILReader class is attached as a zip at the bottom of the post and is worth checking out if you're interested in parsing IL using Reflection.)

After showing this example to a few colleagues (with some heated debates), I found new appreciation for the Visitor pattern. The Visitor Pattern can be really useful anywhere you have a fixed set of data, which surprisingly happens more frequently than you might think.

Take "Application Configuration" as an example. Normally, I'd write a simple Parser to read through the configuration to construct application state. Since the configuration elements are a fixed object model, they can be easily modified to accept a visit from a visitor:

public interface IConfigVisitor
{
    void Visit(MyConfig configSection);
    void Visit(Type1 dataElement);
    void Visit(Type2 dataElement);
}

public interface IConfigVisitorAcceptor
{
    void Accept(IConfigVisitor visitor);
}

public class MyConfig : ConfigurationSection, IConfigVisitorAcceptor
{
    // config stuff here, omitted

    public void Accept(IConfigVisitor visitor)
    {
        visitor.Visit(this);
    }
}

public class Type1 : ConfigurationElement, IConfigVisitorAcceptor
{
    // config stuff here, omitted

    // example fields for Type1
    public string Field1;

    public void Accept(IConfigVisitor visitor)
    {
        visitor.Visit(this);
    }
}

public class Type2 : ConfigurationElement, IConfigVisitorAcceptor
{
    // config stuff here, omitted

    // example fields for Type2
    public string Field1;
    public string Field2;

    public void Accept(IConfigVisitor visitor)
    {
        visitor.Visit(this);
    }
}

Little modification needs to be done to the parser to act as a Visitor. The parser is simply a visitor that collects state as it travels to each configuration element. This example is a bit trivial:


public class ConfigurationParserVisitor : IConfigVisitor
{
    // example internal state for visitor
    StringBuilder example = new StringBuilder();

    public void Visit(MyConfig configSection)
    {
        // a custom iterator could be used here to simplify this
        foreach (Type1 item in configSection.Type1Collection)
        {
            item.Accept(this);
        }
        foreach (Type2 item in configSection.Type2Collection)
        {
            item.Accept(this);
        }
    }

    public void Visit(Type1 data)
    {
        example.AppendLine(data.Field1);
    }

    public void Visit(Type2 data)
    {
        example.AppendLine(data.Field1 + " " + data.Field2);
    }

    public string GetOutput()
    {
        return example.ToString();
    }
}

public class Example
{
    public static void Main()
    {
        MyConfig config = (MyConfig)ConfigurationManager.GetSection("myconfig");

        ConfigurationParserVisitor parser = new ConfigurationParserVisitor();
        config.Accept(parser);

        Console.WriteLine(parser.GetOutput());
    }
}

Here's usually where the argument gets heated: why would anyone do this? Wouldn't you be better off writing a parser that accepts your configuration element as a parameter? A very valid question: it does seem an obtuse direction to follow if you only need to read your configuration file. However, the Visitor pattern becomes useful because new functionality can be added to the configuration elements without having to modify the object model in any way. Perhaps you want to auto-upgrade your settings to a new version, produce a report, display your configuration in a UI, etc.
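For example, a report over the same configuration is just another visitor. The hypothetical ConfigurationReportVisitor below counts elements without touching any of the configuration classes:

```csharp
public class ConfigurationReportVisitor : IConfigVisitor
{
    private int type1Count;
    private int type2Count;

    public void Visit(MyConfig configSection)
    {
        // same traversal as the parser, different behavior at the leaves
        foreach (Type1 item in configSection.Type1Collection)
        {
            item.Accept(this);
        }
        foreach (Type2 item in configSection.Type2Collection)
        {
            item.Accept(this);
        }
    }

    public void Visit(Type1 data) { type1Count++; }
    public void Visit(Type2 data) { type2Count++; }

    public string GetReport()
    {
        return String.Format("{0} Type1 element(s), {1} Type2 element(s)",
                             type1Count, type2Count);
    }
}
```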

One of the subtle advantages to this pattern is that new functionality can be expressed in a single class rather than spread about the solution. This makes it a perfect fit for adding plugins to your application, or building an application that is composited together with a Command pattern.

While not all applications will require this level of flexibility, it can be a very useful pattern when you need it. The upfront cost is a one-time event, so it's a pretty easy refactoring exercise.

submit to reddit

Friday, April 11, 2008

Debugging HitBox Page Attributes

After spending a few hours debugging HitBox page attributes by using "View Source" and digging through the HTML markup, I whipped together a bookmarklet. Drag this link to your browser's bookmark toolbar: Show HitBox. The hyperlink contains:

javascript:alert('Page name= ' + hbx.pn + '\nPage Category=' + hbx.mlc);

Simple enough. I wonder if this approach would work for WebTrends or Omniture as well. If anyone has any examples, post a link.

Update 6-19-08: WebTrends link also available.

Tuesday, March 25, 2008

Google Outlook Sync - Wait

When I learned that Google had an Outlook Sync feature for Google Calendar, I was quick to download and experiment with it. While I was able to sync successfully between both Google and Outlook, I did notice some quirks that others have also reported:

  • Calendar items appear in your Deleted Items folder in Outlook
  • Outlook is shut down during sync.

Of course, the forums have some nasty comments in them, my favorite draws the analogy between Google Calendar Sync and Windows ME. I'm going to hold off on this version (0.9.3) until some of these issues are resolved.

Tuesday, March 04, 2008

Redirect Standard Output of a Service to log4net

I recently wrote a simple windows service that hosted batch files and other applications within a service process. I found some great stuff located here, which really helped me along.

Like many other developers, I quickly discovered that debugging and diagnosing issues wasn't particularly easy. On my machine, it was fairly simple to set a breakpoint and manually attach to the service, but on other machines, the Event Log lacked the detail needed to diagnose issues. What I needed was a way to capture the output of my hosted application.

As I was already using log4net to trace through application flow, I used the following approach to redirect the output of my hosted application into my logger.


using System;
using System.Diagnostics;
using System.ServiceProcess;
using log4net;

public class MyService : ServiceBase
{
    private static readonly ILog log = LogManager.GetLogger(typeof(MyService));

    private Process process;

    protected override void OnStart(string[] args)
    {
        process = new Process();

        ProcessStartInfo info = new ProcessStartInfo();

        // configure the command-line app.
        info.FileName = "java.exe";
        info.WorkingDirectory = @"c:\program files\Selenium\RemoteControlServer";
        info.Arguments = "-jar selenium-server.jar";

        // configure runtime specifics
        info.UseShellExecute = false; // needed to redirect output
        info.RedirectStandardOutput = true;
        info.RedirectStandardError = true;

        process.StartInfo = info;

        // setup event handlers
        process.EnableRaisingEvents = true;
        process.ErrorDataReceived += new DataReceivedEventHandler(process_ErrorDataReceived);
        process.OutputDataReceived += new DataReceivedEventHandler(process_OutputDataReceived);

        process.Start();

        // notify process about asynchronous reads
        process.BeginErrorReadLine();
        process.BeginOutputReadLine();
    }

    // fires whenever error output is produced
    private static void process_ErrorDataReceived(object sender, DataReceivedEventArgs e)
    {
        try
        {
            if (!String.IsNullOrEmpty(e.Data))
            {
                log.Warn(e.Data);
            }
        }
        catch (Exception ex)
        {
            log.Error("Error occurred while trying to log console errors.", ex);
        }
    }

    // fires whenever standard output is produced
    private static void process_OutputDataReceived(object sender, DataReceivedEventArgs e)
    {
        try
        {
            if (!String.IsNullOrEmpty(e.Data))
            {
                log.Debug(e.Data);
            }
        }
        catch (Exception ex)
        {
            log.Error("Error occurred while trying to log console output.", ex);
        }
    }
}
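One caveat with this approach: the hosted process won't exit on its own when the service stops. A minimal OnStop sketch (no graceful-shutdown handling -- a real implementation might try to signal the hosted app first) could look like this:

```csharp
protected override void OnStop()
{
    if (process != null)
    {
        if (!process.HasExited)
        {
            // the hosted console app has no window to close, so terminate it
            process.Kill();
            process.WaitForExit();
        }
        process.Dispose();
        process = null;
    }
}
```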

submit to reddit

Saturday, March 01, 2008

Twelve Days of Code - Wrap up

Over the Christmas break, I started a series of posts about Visual Studio templates. I did some additional work, but never posted that information. I've got some new ideas for some code that I want to experiment and blog about and I may be afforded some additional time at work to pursue related issues, so hopefully I'll find the time to post about those.

Anyway, to clear out the cob-webs, here's the last segment in that series...

My experiment with the Multi-Item Template was to create two projects, create references between them, reference third-party assemblies and create a default class implementation. The biggest challenge and limitation to overcome is that the user-supplied settings in the Multi-Item templates are not shared with each of the Project-Templates, resulting in the Project-Templates reverting to their default values. Very frustrating indeed. (Visual Studio 2008 apparently fixes this)

The best way to really manipulate the projects in a solution is to obtain a reference to the Project through the COM Interop libraries for Visual Studio. Once you have a reference to the project, you can fix the project name, add references, etc.

To overcome the limitations of the Multi-Item template, I used WizardExtensions in both the Multi-Item template and the Project-Templates, then used a singleton to carry settings between the projects.
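A rough sketch of that singleton hand-off (the class and member names here are my own invention, not the code I actually shipped):

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TemplateWizard;

// shared state between the Multi-Item wizard and the Project-Template wizards
public sealed class TemplateSettings
{
    public static readonly TemplateSettings Instance = new TemplateSettings();
    public Dictionary<string, string> Replacements = new Dictionary<string, string>();
}

public class SolutionWizard : IWizard
{
    public void RunStarted(object automationObject,
        Dictionary<string, string> replacementsDictionary,
        WizardRunKind runKind, object[] customParams)
    {
        // capture the user's settings so the child project wizards can
        // re-apply them instead of reverting to defaults
        foreach (KeyValuePair<string, string> pair in replacementsDictionary)
            TemplateSettings.Instance.Replacements[pair.Key] = pair.Value;
    }

    // remaining IWizard members omitted for brevity
}
```

The Project-Template wizards then copy the singleton's values back into their own replacementsDictionary during their RunStarted calls.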

This approach works, but seems pretty complicated, especially dealing with the COM Interop for Visual Studio. I still would need to strong-name the assemblies and write an installer that installs the templates and the assemblies.

It'll be interesting to contrast this to the Guidance Automation Toolkit.

Monday, December 31, 2007

Google Search Results

Friday, December 21, 2007

Debugging WizardExtensions for Visual Studio Templates

As per my previous post, this exercise would probably be much easier if I used the Guidance Automation Toolkit, but in the spirit of Twelve Days of Code, I promised to boldly venture into areas I normally don't go. I decided that I wanted to try out a WizardExtension so that I could compare the experience with the Guidance Automation Toolkit. So I created a new project and added the following references:

  • EnvDTE
  • EnvDTE80
  • Microsoft.VisualStudio.TemplateWizardInterface
  • System.Windows.Forms

The Visual Studio Template documentation says you need to sign your assembly and install it into the GAC, but that's crazy. Rather than jumping through hoops, I found a handy forum post describing how Visual Studio follows the standard assembly probing sequence, so the assembly just needs to be in a place that the devenv.exe process can find it. Signing and installing into the GAC is simply a security measure. I didn't want to dump my custom assembly in with Visual Studio's assemblies (where I would forget about it), so I created a custom folder in %program files%\Microsoft Visual Studio 8\Common7\IDE and added it to the probing path in devenv.exe.config.

To enable debugging for my custom wizard-extension, I use two Visual Studio instances: one for my wizard-extension, the other for testing the template. Here are the steps involved:

  • Add the assembly and class name to your ProjectGroup.vstemplate file:
<WizardExtension>
 <Assembly>Experiments.TemplateWizard</Assembly>
 <FullClassName>Experiments.TemplateWizard.CustomizeProjectNameWizard</FullClassName>
</WizardExtension>
  • Zip up the updated template and copy it into the appropriate Visual Studio Templates folder.
  • Compile the wizard-extension assembly and copy it and its pdb to a path where visual studio can find it
  • Launch a new instance of Visual Studio
  • Switch back to the other visual studio instance, attach to the "devenv" process (the one that says it's at the start page) and set your break-points
  • Switch back to the new instance of Visual Studio and start the template that contains your wizard extension
  • debugging goodness!!

Well, at least I saved myself the effort of signing, etc. This exercise showed that very little is actually done at the ProjectGroup level of a Multi-Project template: RunStarted is called, followed by the ProjectFinishedGenerating method. The biggest disappointment is that the project parameter in ProjectFinishedGenerating is null. This is probably because the item being created is a Solution, not a project.

The last ditch (seriously, ditch!) is to cast the automationObject passed into RunStarted to _DTE, and then work through COM interop to manage the Solution. That sounds romantic.
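For the record, that last-ditch cast looks something like this. A sketch only -- what you do with the Solution once you have it depends entirely on your template:

```csharp
public void RunStarted(object automationObject,
    Dictionary<string, string> replacementsDictionary,
    WizardRunKind runKind, object[] customParams)
{
    // the automationObject is the running Visual Studio instance
    EnvDTE._DTE dte = (EnvDTE._DTE)automationObject;

    // from here, the Solution and its Projects are reachable via COM interop
    EnvDTE.Solution solution = dte.Solution;
    foreach (EnvDTE.Project project in solution.Projects)
    {
        // e.g. fix up project names, add references, etc.
    }
}
```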

submit to reddit

Thursday, December 20, 2007

Bundling Visual Studio templates for distribution

Microsoft's done a fairly good job of packaging for Visual Studio templates. Simply:

  1. Create an xml file that adheres to the Visual Studio Content Installer Schema Reference
  2. Rename the xml with a "vscontent" extension
  3. Place the vscontent file and your template zip into another zip file
  4. Rename that zip file with a "VSI" extension.

Now, when you double click the VSI file it runs a wizard that installs your template into the appropriate Visual Studio Template folder.
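As a concrete illustration, a .vscontent file for a project template might look something like this -- the file name, display text, and attribute values are placeholders, so check the schema reference for the full set of options:

```xml
<VSContent xmlns="http://schemas.microsoft.com/developer/vscontent/2005">
  <Content>
    <FileName>MyTemplate.zip</FileName>
    <DisplayName>My Project Template</DisplayName>
    <Description>An example starter template.</Description>
    <FileContentType>VSTemplate</FileContentType>
    <ContentVersion>1.0</ContentVersion>
    <Attributes>
      <Attribute name="ProjectType" value="Visual C#"/>
      <Attribute name="TemplateType" value="Project"/>
    </Attributes>
  </Content>
</VSContent>
```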

submit to reddit

Wednesday, December 19, 2007

Visual Studio 2005 Multi-Project Templates - a waste of time?

As part of my twelve-days-of-code, I'm tackling a set of small projects geared towards simple project automation. I've discovered in recent projects that although the road is always paved with good intentions, other tasks, emergencies and distractions always prevent you from accomplishing what seem to be the most minor tasks. So when starting out on a new task, we always cut corners with the intention of returning to these trivial tasks whenever we find the time, or when they become justified in our client's eyes. However, if we started out with these things done for us, no one would question their existence or worry about a catch-up penalty; we would just accept these things as best-practice.

Visual Studio Project templates are interesting, though my first encounters with them suggest they miss the mark. For my projects, I find the effort isn't about creating the project, it's about creating the overall solution: project libraries, web sites, test harnesses, references to third-party libraries and tools, build-scripts, etc. Visual Studio supports the concept of "Multi-Project Templates", which are limited (see below), but I suspect that the Guidance Automation Extensions might fill in the gaps.

Visual Studio supports two types of templates within the IDE, and a third type which must be stitched together using XML. The first type refers to "Item Templates" which refer to single files which can be included in any project. I'm focusing more on Project templates and Multi-Project templates.

Within Visual Studio, creating a project template is extremely easy: you simply create the project the way you like and then choose the "Export Templates..." option from the File menu. The project and its contents are published as a ZIP file in "My Documents\Visual Studio 2005\My Exported Templates". A big plus on the template structure is that all the files support parameterization, which means you can decorate the exported files with keywords that are dynamically replaced when the template is created by the user. The export wizard takes care of most of the keyword substitution for you, such that root namespaces in all files will match the name of the user's solution. With this in mind, a Project Template is "sanitized" and waiting for your client to adopt your structure with their name.

Multi-Project Templates stitch multiple Project Templates together by using a master xml-based template file. These templates can't be created using the IDE, but you can create a solution and export each project as a Project Template; then, following this handy MSDN article and the Template Schema reference, you can quickly piece together a master template.
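The master template is a small xml file with Type="ProjectGroup" that links the exported Project Templates together. A sketch with placeholder names (the real schema has more TemplateData options):

```xml
<VSTemplate Version="2.0.0" Type="ProjectGroup"
    xmlns="http://schemas.microsoft.com/developer/vstemplate/2005">
  <TemplateData>
    <Name>My Solution Template</Name>
    <Description>Class library plus test harness.</Description>
    <ProjectType>CSharp</ProjectType>
  </TemplateData>
  <TemplateContent>
    <ProjectCollection>
      <ProjectTemplateLink ProjectName="MyLibrary">
        MyLibrary\MyTemplate.vstemplate
      </ProjectTemplateLink>
      <ProjectTemplateLink ProjectName="MyLibrary.Tests">
        MyLibrary.Tests\MyTemplate.vstemplate
      </ProjectTemplateLink>
    </ProjectCollection>
  </TemplateContent>
</VSTemplate>
```

Notice that each ProjectName is a hard-coded attribute value in this file.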

However, there are a few really nasty limitations with Multi-Project Templates. The biggest issue is that the Project Name cannot be parameterized, so the template adopts the names that are defined in your configuration file. As a result, the only thing you can really customize is the name of the solution. I was completely baffled by this: I thought I must be doing something wrong. However, after a few minutes of googling, others had come to the exact same conclusion.

Fortunately, the template system supports a Wizard framework, which would allow you to write some code to dynamically modify the solution. Unfortunately, the code for this would have to be strong-named and installed in the GAC. I'm tempted to wade into this, but I fear that I might be better off looking at the Guidance Automation Toolkit.

submit to reddit

Monday, December 17, 2007

Visual Studio Templates - Export Template not available

So I started my foray into the Twelve Days of Code, focusing on a "Software Automation" theme. First stop: "Visual Studio Project templates". I've played with these templates before, and because Visual Studio makes them considerably easy to do, this first stop should be an easy one. However, I had an interesting problem with Visual Studio that slowed me down.

The steps to create a template are straightforward: you create the item and then use the "Export Template" wizard in the "File" menu. However, the "Export Template..." option did not appear in the File menu.

I recently got a new laptop, and had to reinstall everything from scratch. At first I thought it was because IT only installed Visual Studio Professional instead of the Enterprise version. But there have been some other peculiarities, for example, the "Debug" tool bar was missing crucial items like "Step In", and "Step Out".

The culprit was that I had installed SQL Server 2005 after Visual Studio. Because they share the same shell (sounds a lot like the Composite UI Application Block), SQL Server had changed the default settings for Visual Studio.

To fix:

  1. Select "Import and Export Settings..." from the Tools menu.
  2. Choose the option to "Reset all Settings."
  3. Choose the option to save your current settings, just in case.
  4. Pick an option that doesn't include "SQL Server". I chose the "Visual C# Development Settings"
  5. Finish

Perhaps the software-automation tip here is to configure your settings with defaults for your team, export them out and share the .vssettings file (XML) with the team.

Sunday, December 16, 2007

The Twelve Days of Code

One of the things I have always enjoyed over Christmas holidays is finding a few hours here and there for myself to explore new technologies or tinker with side-projects. With all the events and distractions, finding time for yourself can be a pretty challenging task. This year, I'm thankful to have a lot of time off, so I've decided the best way to tackle this is to find projects that are interesting but don't require a huge investment of thought. So I'm not going to bother with cracking open the XAML specifications and trying a SilverLight project. Too much time. Too much thought.

So I've come up with the "Twelve Days of Code". Catchy, but not corny. Here are the rules:

  • Tackle a small project each day
  • Use technologies I haven't used before or wouldn't naturally use in the course of a work project
  • No project should take more than an hour.
  • If you get snagged, the hour is about the snag.
  • All the projects have to share a common theme
  • The projects need to provide value to me beyond the scope of the holidays.
  • Keep 'em blog worthy

I've decided my common theme should be "Software Automation" which is anything that saves time for repeated tasks. If I can save other developers on my team time, also great.

I've started a small list of to-do's, and I'm using http://www.rememberthemilk.com to help me with the scheduling, etc.

The list of things I want to play with:

  • Visual Studio Templates
  • Custom Installers / Wizards
  • Click Once deployment

I'll keep ya posted.

Thursday, July 05, 2007

Technorati

These last few months I've been playing with all sorts of web 2.0 tools: TadaLists, Twitter, Jaiku, wakoopa, Flickr, Del.icio.us, and others. I just started Technorati, and I'm using this blog post to "claim" my blog. (Technorati Profile)

Tuesday, July 03, 2007

Simpsons Avatars


me and lori
Originally uploaded by bryanbcook
I spent a good chunk of time with Lori on Thursday night building avatars on the Simpson's Movie website.

The web site is entirely Flash, and they have a concept where your avatar appears in the site in various locations. It's a neat idea, although it appears that most of the site is in "coming soon" mode, which reeks of web-agency poor planning and scope-creep challenges (or a series of well-designed strategies...)

Unless I missed it in the UI, the site doesn't provide an ability to export your avatar as an image, though it really should! (This image was taken from a screen capture and manipulated in MSPaint.) I learned of the site and its avatar abilities through the viral aspect at work: everyone was making them and posting them to facebook.

I had a lot of fun putting the avatars together and I guess I'm more stoked about the movie than I was before. Since the "coming-soon" aspect didn't capture my attention (the content has to be amazing for that attraction to work), I probably won't be returning to the site anytime soon. My work group is very web-savvy, but I wonder how many other individuals are hacking the site in this manner? If the attraction is the avatar, they should be milking that and allowing users to send them to friends, etc.

Try it out for yourself: http://www.simpsonsmovie.com

Friday, June 01, 2007

Karma bottleneck

I can't believe how buried I've been recently. Cooped up. Recoiled. Tucked away.

I think things went south at the beginning of March.

I went away on a business trip for a full week. The flight down wasn't direct, so it was broken up into two small, uneventful flights, which turned what would normally be a 90-minute direct flight into about six hours. On the second leg of the trip, I was tucked in the very back of the plane (did I mention my employer was paying for the trip? We always seem to get the crappiest seats with these last-minute deals). On the descent, I was listening to a podcast on my first iPod. The whole podcast experience was significant because my iPod was still very new, and you know how us guys get with our new toys: we fall into a ritual with them that seems to define who we are at that time. While most toys eventually lose their appeal and just become elements of the grind, my iPod was different. It's prototypical of this decade, an extension of yourself in musical form. In that sense, I was very caught up in it.

The odd thing is -- somehow, iPods crash planes -- or at least that's how the flight attendants make it seem, because they want you to turn it off during the descent. I comply, but keep my cherished iPod close at hand by winding up the headphones and setting it in my lap. After a few long, solitary seconds, I switch to the book I brought in my carry-on.

When we land, it's six hours into my 90-minute flight, and in my haste I quickly pack up and get off the plane. I'm just worn out. It's not until I'm in the rental car 20 minutes later that I think about resuming my podcast. Since I'm in a strange city on my own in the middle of the night, my attention is focused on getting to the hotel, but in the back of my mind I can't place what I did with my iPod: it's either in my bag or in my jacket, even though I don't recall putting it away.

When I arrive at the hotel, I check my jacket pockets as I walk toward the front lobby. It's strange that it's not there, so it must be in the carry-on, though that seems really "off" somehow. In the back of my mind, I begin to imagine what my wife will do to me when I tell her I lost the Christmas gift she bought me. Chuckle. Let's hope not.

I check-in.
I jump in the elevator. I pat down my carry on.
Yeah, that's not, ...uh, ...not good.

I get to my room and I quickly empty my pockets and check my carry on.
I open all the bizarre pockets that I never use.
I empty the carry-on and shake it.

FaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaK!!!

Yeah, screwed.

I call the airport and get through to Lost and Found. My flight was the last flight of the night. My iPod is probably sitting on the floor, on the seat, or tucked between the seats of the 9:35pm Delta flight from Cincinnati to Richmond. Although it's only been 40 minutes since the flight, it's gone. Translation: "The white one with no scratches or identifiable marks? Yeah, it's going for $15 on eBay right now."

Somehow, this event has caused a ripple effect in what I envisioned this year to be. The choices between carrying-cases that my wife wanted, the iPod dock in the living room, the new iMac that would house my iTunes library -- all of these decisions, interrupted and sidelined indefinitely.

Instead of moving forward, I seemed plagued with indecision: buy a new one, find a cheaper one, or buy a stolen one from eBay. All these crazy thoughts happening simultaneously, like a neuron misfire causing bottlenecks all the way down the line. An extension of myself was lost, and it was as though I accepted defeat and stopped making decisions. Coupled with the onslaught of spring cleaning and never-ending honey-do lists, it was as though I shut down.

Earlier this week, I managed to get away from the grind. I bought an iPod replacement. I bought a car charger with an FM unit and a carrying case. And suddenly, just as summer arrived, so did I.

Sunday, April 29, 2007

Impressed with Google's Manage History feature

It's been a few months since my last post -- I got sidetracked from the NCover tutorial I was working through a few months back. (Incidentally, my project desperately needs NCover, so I'll finally get a chance to jump back into that game.)

Ages ago, I relied exclusively on the Google Toolbar in Internet Explorer. But when I realized that big brother was analyzing every move I made, I got a bit creeped out and uninstalled the bugger. The heavy influence of FireFox and its built-in search bar (CTRL+K) made enough of an impact that I've never looked back.

However, a lot has changed in the Internet landscape, as well as in my personal lifestyle, in the last few years: Google Analytics and GMail have rolled into my digital life, and even this blog now uses my Google Account. I've come to know Google's "Don't be evil" motto, which has relaxed some of my concerns about how, gradually and behind the scenes, they've been collecting my information without requiring me to install software. Now that I'm using my Google Account for so many things, all my FireFox searches in the last 15 months (bless'ed CTRL+K) have quietly been captured. Although I've seen my searches appearing on my personalized homepage, I haven't been that concerned: at least I know what Google knows about me.

Today I noticed Google's new "Manage History" feature, which required me to install the Google Toolbar before using it. Although I installed the toolbar with some reluctance, I am pleasantly surprised by the Manage History feature. Not only can I view my search history, but I can also selectively edit any item -- it's my data!

Google tells me that I've done waay too many searches, and it tells me what I clicked on for each search. It provides trend information on my usage: monthly, by day, by hour. (FYI: I do more searches in January, Thursdays are most popular, and I've never searched for anything between 1 and 5 am.) It can tell me my top ten search queries in the last week, month, year, or all time.

I can view my history in terms of web, news, images, sponsored links -- even Google Maps and Video. Even scarier: it can offer links that I might be interested in based on my previous usage.

What also seems interesting is that Google is providing bookmark functionality -- a competing feature to online bookmark services such as del.icio.us.

What would be really interesting is to be able to cross reference this history to my activity elsewhere -- the topics in my inbox, the tasks or projects I was working on at that time.

Also -- if Google could have tracked my history using my google-account then why did I need the Toolbar in order to use the history feature?

Tuesday, February 13, 2007

NCover Setup - Part II

Following up on my previous post, I'm continuing to set up code coverage in my .NET 2.0 project with some specific criteria.

Kudos to Grant Drake (aka Kiwidude), who is very committed to the cause. I spent some time on his site last night, poring over his FAQ. He left a comment on my blog earlier, citing an FAQ entry that I'd missed on his site. Plus, he recommended version 1.5.4 of NCover instead of the latest 1.5.5 beta. Good stuff!

So, to follow through on my previous post:

The first thing I did was move my test namespace and their classes into a separate ClassLibrary. This included all the standard stuff:

  • remove the reference of NUnit from the Core library
  • add an NUnit reference to the Test library.
  • add a Core Library reference to the Test project.

After a quick compile, I dropped down to a command prompt and executed the same statement, except this time I had to specify the name of the Test harness assembly:

ncover.console nunit-console Test.dll //a Core

This worked and satisfies my second objective! The added bonus is that the coverage report now only contains information about the Core namespace. This is great, but as an aside, since nothing changed in the code, I wonder if this is a bug in the "//a" switch? Update: The //a switch is for assemblies without their extensions, not namespaces.
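Given that update, the //a switch should accept a semicolon-delimited list of assembly names without their .dll extensions. So if I later needed coverage on a second assembly (say, a hypothetical Utils.dll -- not part of my actual solution), I'd presumably run:

ncover.console nunit-console Test.dll //a Core;Utils

I haven't tried the multi-assembly form yet, so treat that as a sketch rather than a verified command.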

As a further refinement, I can begin to take advantage of NUnit's Project capabilities. This enhancement simplifies things greatly. It allows me to:

  • test/cover multiple assemblies at a time
  • shield my build script from having any knowledge about the test harness configuration.
  • specify where to locate the configuration file (this should solve my third objective)

To create the Nunit project file:

  • open the NUnit-Gui and choose "New Project". I like to save the nunit file at the root of my solution.
  • add in the assemblies that contain your test fixtures. In my case, I have to specify the relative path to my Test Harness: /Test/bin/Debug/Test.dll

Since an NUnit project file is simply an XML file, my project file contains the following:

<NUnitProject>
  <Settings activeconfig="Debug" />
  <Config name="Debug" binpathtype="Auto">
	<assembly path="Test\bin\Debug\Test.dll" />
  </Config>
</NUnitProject>

With this in place, I can now generate my coverage report at the command line with the following statement:

ncover.console nunit-console TestProj.nunit //a Core

So far, this is pretty good. The second objective was fairly easy, and although I've violated my first objective, Kiwidude has suggested this might be something I can correct in the build script, so I'll get to that soon enough.
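When I get to the build script, a NAnt target wrapping the same command might look something like this. (A sketch only: the target name, paths, and the assumption that ncover.console and nunit-console are on the PATH are all mine, and I haven't run this yet.)

<target name="coverage">
  <!-- Run NUnit under NCover against the project file at the solution root;
       //a restricts coverage to the Core assembly -->
  <exec program="ncover.console">
	<arg value="nunit-console" />
	<arg value="TestProj.nunit" />
	<arg value="//a" />
	<arg value="Core" />
  </exec>
</target>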

Tomorrow, I'll tackle my third objective of being able to resolve configuration settings.
