Wednesday, January 04, 2006

Microsoft Google Search

I don't know how many times I've had to weed through search results to find what I was looking for. Turns out, Google has probably noticed that I (ok, and the rest of the planet) have done a lot of Microsoft related searches.

In the Advanced Search of Google, they've provided customized search pages for common searches, including:

  • Apple
  • Macintosh
  • BSD
  • Unix
  • Linux
  • Microsoft
  • and others

Even better -- you can add the Microsoft search to Firefox as a Search Engine Plugin

Friday, September 02, 2005

Classic Computer Magazine Archive

How old school is this? Fond memories.

Monday, June 20, 2005

Das Keyboard

Ya, you typing.

Monday, June 06, 2005

UltraMon

I've been running a laptop and a flat-screen monitor for a few months now and stumbled upon UltraMon. It's a commercial product but it provides a trial version. After reading its specs, I'm definitely going to check it out.

Wednesday, May 25, 2005

Dynamic control generation using XSLT

In a recent project, we needed to render .net controls dynamically. Instead of using ascx controls, we used XSLT to transform static xml data into dynamic presentation layers -- a viable approach to providing multiple looks and feels for the site. Here's a quick walkthrough of how it works.

Dynamically loading the HTML is simple, using the built-in ParseControl method of the System.Web.UI.Page class.

Control c = Page.ParseControl( "<b> Literal Text </b>");

The same logic can be applied to parsing the output of an Xslt transformation.

string transformOutput = xslTransformer.Transform();
Control c = Page.ParseControl(transformOutput);

If you want to add server-side controls into the transformation, there are a few gotchas:

  1. The XSLT parser needs namespace definitions for the .net and custom tag prefixes.
  2. Those namespaces appear as attributes in the transformed output, and will need to be stripped out before the output is parsed and added to the Page.Controls collection.

If you're adding asp.net controls or custom controls into your XSLT file, you'll need to declare the namespace on the xsl:stylesheet element, otherwise the XML parser will complain:

xmlns:asp="http://schemas.microsoft.com/AspNet/WebControls"

Incidentally, if you're using Visual Studio 2003, adding this namespace declaration will give you Intellisense for your asp.net controls in your XSLT file.
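
To make that concrete, a minimal stylesheet header might look like this (a sketch -- the asp:Label template and the source xml element names are illustrative, not from the project):

```xml
<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:asp="http://schemas.microsoft.com/AspNet/WebControls">

  <!-- emit a server-side label for each item in the source xml -->
  <xsl:template match="item">
    <asp:Label runat="server">
      <xsl:value-of select="@title"/>
    </asp:Label>
  </xsl:template>

</xsl:stylesheet>
```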

With all the proper namespaces defined, you'll bypass any nasty XSLT parser errors; however, these namespaces will now appear scattered throughout your transformed output as attributes. If you leave these values in and try to parse the output into a control, the .net runtime will try to resolve the namespaces as attributes of the controls and will generate runtime errors.

Fortunately, a quick regular expression can strip all these namespace attributes out of the transformed output. One gotcha: String.Replace only does literal replacement, so you need Regex.Replace from System.Text.RegularExpressions:

transformedOutput = Regex.Replace(transformedOutput, "xmlns:\\w+=\"[^\"]+\"", "");

With the namespaces stripped out of the transformed output, you should be able to parse the string into a control.

using System.IO;
using System.Text.RegularExpressions;
using System.Web.UI;
using System.Xml.XPath;
using System.Xml.Xsl;

XslTransform transform = new XslTransform();
string xslFilePath = Server.MapPath("/controls.xslt");
transform.Load( xslFilePath );

string xmlFilePath = Server.MapPath("/data.xml");
XPathDocument xmlDoc = new XPathDocument( xmlFilePath );

// XslTransform writes to a TextWriter, so capture the output in a StringWriter
StringWriter writer = new StringWriter();
transform.Transform( xmlDoc, null, writer );
string transformedOutput = writer.ToString();

// strip the namespace attributes before parsing
transformedOutput = Regex.Replace(transformedOutput, "xmlns:\\w+=\"[^\"]+\"", "");

Control c = Page.ParseControl( transformedOutput );
Page.Controls.Add( c );

Cheers


Monday, May 23, 2005

Poking around in SMTP

A few years back, I had seen a mailer engine that spit out email messages in EML format. I found this engine to be extremely helpful because I could turn off the SMTP service, run the mailer-engine, and then manually inspect the messages in Outlook Express. If the messages were fine, I re-enabled the SMTP service.

At the time, it was clear that the engine was leveraging the CDO.Message object's ability to persist to a file.

Recently, I wondered if this was possible with the .net framework, but was disappointed to find that the System.Web.Mail.SmtpMail class only exposes a "Send" method.

Enter Reflector.

The SmtpMail implementation is very interesting. Reflection shows that there are three internal classes:

  • CdoSysHelper
  • CdoNtHelper
  • LateBoundAccessHelper

As it turns out, SmtpMail acts as a proxy for the legacy COM objects "CDO.Message" and "CDONTS.NewMail" -- it detects which environment you're running in, then delegates the message to the appropriate COM object's "Send" method.

What's equally interesting is that the .net framework doesn't use COM Interop to talk to these COM objects. Instead, it uses the LateBoundAccessHelper to instantiate the COM object by its ProgId, then sets its properties and invokes its methods through late binding.
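
For the curious, late-bound access to a COM object by its ProgId looks roughly like this (a sketch of the idea, not the framework's actual code -- it only runs on Windows with CDO installed; "Subject" and "Send" are real CDO.Message members, but the usage here is mine):

```csharp
using System;
using System.Reflection;

class LateBindingDemo
{
    static void Main()
    {
        // look up the COM type by ProgId -- no interop assembly required
        Type cdoType = Type.GetTypeFromProgID("CDO.Message");
        object msg = Activator.CreateInstance(cdoType);

        // set a property late-bound, much like LateBoundAccessHelper does
        cdoType.InvokeMember("Subject", BindingFlags.SetProperty,
            null, msg, new object[] { "hello" });

        // invoke a method late-bound
        cdoType.InvokeMember("Send", BindingFlags.InvokeMethod,
            null, msg, null);
    }
}
```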

To stream my messages out into EML format I suppose I could use COM Interop to the CDO library, but that would mean I would have to package the interop wrapper (and its dependencies) with my assembly.

I'll have to take a look at extracting this code using a Reflector add-in, and extending the class to save to a file instead of sending directly. It'll be interesting to see if I can do this without adding an interop wrapper to CDO. I wonder if there will be any performance drawbacks to this approach...

yet another side-project...


Saturday, May 21, 2005

The saga ends

At the end of Star Wars: Episode III Revenge of the Sith, everything wraps up nicely, making a decent bridge between Episodes III and IV. Still, it's a surreal experience to see a saga complete itself in the middle.

So how was it? Prior to the show, everyone gave me one of two reviews: either "Good, the first two were bad" or "Awesome, the first two sucked." Clearly the delta between good and bad and between awesome and sucked is the same.

Personally, I enjoyed it -- or rather, parts of it. Unlike all the other Star Wars movies, there's a whole lot more story to tell: Anakin's fall to the dark side and the dawning of Darth Vader, the end of the Clone Wars and the death of the Republic, not to mention wrapping up one trilogy and the segue to the next.

This bulk of multiple stories can't be told with action alone, so there's a whole lot of Lucas-style dialogue that has to find its way into two hours and twenty minutes. As a result, I found it very fragmented -- only a few seconds of setup for a scene, some forced dialogue, followed by the traditional Star Wars wipe to the next story.

But it's done well, and clearly one of the lessons Lucas has learned since the last two films is that special effects are the vehicle, not the story. Revenge of the Sith uses the technology from the last two films transparently. There isn't a twenty-minute pod race, nor a pointless CGI character that seems like an ILM show-off. Yoda and General Grievous are done so well that you become absorbed by the complexities of their characters rather than their near-life-like rendering.

All in all -- it may be Lucas's redemption for the Star Wars saga. I might even see it again.

Monday, May 09, 2005

Star Wars Episode III: Revenge Of The Sith - Review Digest

I nearly soiled myself while reading this.

Thursday, May 05, 2005

Star Wars III hype

The new Star Wars movie arrives in a few weeks. I'm a little undecided.

I grew up a Star Wars fan. I had all the toys, the books... the works. Back then we didn't have VCRs, so a trip to the theatre meant your brain was on full record. I'd known since I was a kid that there were supposed to be nine movies. I had played out in my head what they were supposed to be about, and I always figured the ninth was where Darth Vader bit it. (Side rant: how surprised was I when Luke pulled Darth Vader's helmet off in ROTJ???)

Needless to say, I had some high expectations for Episode I. But I played it down, was cool about the whole thing. I remember sitting in the theatre just before the show was about to start, and it felt like any other movie experience. But when the green Lucasfilm logo faded and the familiar bright blue "far, far away" catch-phrase started to fade in, it dawned on me -- "I have no idea what comes next" -- and I was instantly 7 years old again.

The last two have been pretty disappointing; I don't need to go into the details. The whole thing is like a bad car wreck: you don't want to watch, but you're strangely compelled.

At this point, the trailer looks pretty good. Actually, the trailer looks *too* good. But still, Lucas could screw it up. After all, he's considered some sort of editing genius -- he could have given very specific instructions for the trailer crew, "Be sure to exclude all suck-ass parts".

You never know, Jar-Jar could show up with a light-saber screaming "Mesa using-da force!"

Please, Mr Lucas, don't screw this one up. If you do, my entire childhood memories will be forfeit. At the very least, please tell me you hired a dialogue coach.

I'll show up with my brain set for record.

Even the bad guys have feelings...

The Darth Side: Memoirs of a Monster

Friday, February 18, 2005

Movin' Madness

This month our client migrated their servers to another environment. When the actual date for the migration arrived, it felt a lot like the moving van had pulled up to the client's home while he stood in his bathrobe, frantically trying to wave it off for a few more days. For the most part, the server migration went fairly well, with some issues (big and small). I've outlined a few of them -- some of which drove me crazy.

MSXML 4 - Access Denied

We've got a neat little flash microsite that pulls an xml feed from an external site using classic asp. Interestingly enough, a simple ServerXMLHTTP .Send() against a simple URL was returning an Access Denied error. Turns out, this is a feature of security hardening in MSXML4 SP2. I had to change the Local Security Policy, add the URL to the Trusted Sites internet zone in Internet Explorer, and REBOOT the server. Quite a bit of hassle just for an xml feed.

Cannot resolve conflict in Collation

We restored databases from the old production environment onto the new environment, and found that some applications weren't behaving as expected. Poking into the error, I found that the collations (the language and sort order of the database) differed between databases. This was probably because the regional settings between the machines were different, and the databases created on the new server defaulted to an incompatible collation. To resolve it, I had to:

  1. Create a new version of the database with a different name.
  2. Use an ALTER DATABASE statement to set it to the desired collation.
  3. Script the original database into a single block of SQL DDL statements.
  4. Remove all collation-specific references on fields, as the script would otherwise create varchar fields with specific collations.
  5. Use a DTS task to copy the data from one database to the other, specifying Use Collation so it would adopt the collation of the target machine.
  6. Drop the original database and rename the new version to the original name.

Cannot enlist in new Transaction

Brilliant. Two months ago I asked the new hosting provider if they had any best practices on how to configure an environment with a firewall between the database and web servers. The only answer I received from their tech team was to use port 1433 -- which, in layman's terms, is like saying cars need gas: SQL always uses port 1433. The problem I knew we were going to have is when you actually try to use distributed transactions from the web server: there is a lot of communication between the web server and database -- way more than just port 1433. When I found out they weren't aware of this concern, that should have been my first clue.

I gave up on the hosting provider making this easy for me, so I provided them very clear instructions on how I was going to configure DCOM to restrict the web and database servers to specific ports. I clearly told them that once this was done, I would need two-way (inbound/outbound) communication on these ports, and I outlined very specifically which ports needed TWO WAY communication. When I received an email confirmation that they had opened the ports for TWO WAY communication, I politely thanked them and went back to configuring my applications.

When I hit the "unable to enlist in new transaction" error, I was a bit surprised -- but since I hadn't had a whole lot of time to fully test the application in the new environment, not that surprised. I thought I might be having problems with incorrect registry settings, or name resolution, etc. It was about forty minutes later, after double-checking my settings and reading knowledge base articles on the problem, that I discovered the ports had been opened for the web server, but not the database.

The email I sent the hosting provider, to which I attached my previous email with the clearly outlined instructions, was, retrospectively, not that polite. I only wrote SOME of it in ALL CAPS. (Incidentally, why is it that ALL CAPS LOOKS LIKE YOU'RE SHOUTING?????)

Unable to convert varchar to datetime

When I realized that the default regional setting of the server didn't really help, I went digging into the code. We had a form that collected the data in a very specific format:

Please provide your date of birth (yyyy/mm/dd):

At the code level, some very ancient classic asp code was building the sql statement inside the script (terrible!!!) and opening the recordset with the resulting SQL:

strSQL = "SELECT count(*) FROM myTable WHERE DateCreated = '" & Request.Form("TimeStamp") & "'"
oRs.Open strSQL, oConn

Brutal. Here we're basically asking the SQL server to resolve the text into a datetime using whatever format the user supplied -- if your SQL box is configured with a different date format, you're pretty much screwed. While writing inline sql inside your presentation code is considered extremely bad form, I can appreciate the developer's complaint that it's too much work to write a custom COM object just for a silly database call. But if you have to use inline sql, you should at least attempt to use a stored procedure. And if you're so lazy that you can't write a stored procedure, then heaven forbid you write a few extra lines of code and use a parameterized sql statement, like so:

strSQL = "SELECT count(*) FROM myTable WHERE DateCreated = ?"
Dim oCmd
Set oCmd = Server.CreateObject("ADODB.Command")
Set oCmd.ActiveConnection = oConn
oCmd.CommandText = strSQL
oCmd.Parameters.Append oCmd.CreateParameter("TimeStamp", adDBDate, adParamInput, 8, Request.Form("TimeStamp"))
oRs.Open oCmd, , 1, 3

Although it's a couple extra lines, I sleep better knowing that ADO will take care of the datetime format conversion.

Sunday, January 30, 2005

What time is it?

One of my current projects involves relocating my client to a new hosting environment. This is proving to be a challenge, as I don't really know the application that well yet. The code is a hybrid of classic asp and .net, and one of the bugs that is totally driving me crazy is date formatting.

The old server environment was configured with English (US) as the default locale, whereas the new environment is English (Canada). This matters because a few database inserts use VBScript's Now() function to write the current time-stamp. (Not exactly sure why the database isn't providing this functionality -- but that's another problem.)

What's odd is that I changed the default locale but wasn't seeing any changes in the old asp code -- the date format was still wrong. I had changed the regional settings (or at least I thought I had) and couldn't figure out why I was seeing the proper date format from the command line but not in asp. The glaring mistake I made was missing the "Set Default" button at the bottom of the control panel. If you forget to click that, you're only applying the changes to the current user profile.

The link for this article points to an ms kb article on some new options in the regional settings for w2k sp2+. I'll come back to that if it proves to be a problem with my .net code.


Wednesday, January 19, 2005

the countdown goes askew!

Lori and I have been counting down the days to our house closing on a chalkboard in the kitchen. Our real estate agent just called us and asked how we felt about moving the closing date up by a month! I'm all for moving in early, but we've already paid for our March rent. We're trying to figure out a way to maximize our move-in time while minimizing the money going out.

i'll have a low-fat soy half half-caf-decaf-cappuccino with a lime and a twist

If code is half-science and half-art, then this is either code reflecting life, or geeks seeing code when ordering coffee.

Tuesday, January 11, 2005

svn up

Over the last year, our organization has been migrating our version control systems to Subversion. Man, what a ride.

Previously, I'd grown accustomed to Visual SourceSafe, which looking back seems like -- well, for lack of a better term, like sticking your code into a shredder and setting it on fire. While it does have its advantages (it doesn't require any brain-power to use), it doesn't do complex things well, if at all. If you go down the path of pushing the envelope and try to make it behave like an enterprise version control system... it'll bite back. Hard. Am I surly? Sure, why not. Unfair? Well, you try repairing a 3 gig corrupted vss database and call me unfair.

Subversion is a far cry from the simplicity of VSS: it's command-line utilities only (there are open-source GUIs, but I haven't heard any good things), the concepts are radically different, and it doesn't plug into the IDE. I'm not complaining -- these are advantages. Subversion requires you to think differently about what you consider source control. Gone are the days of my team complaining that they "...can't work on that file because it's currently checked out."

At any rate, I won't be going back to VSS any time soon. I'm sold on Subversion. Coupled with CruiseControl.NET, we've got a pretty sweet development-integration cycle happening.

As of late, I've switched projects -- a form of promotion, if you want to call it that -- and I'm now responsible for a large account that pushed out about 62 projects last year. This means we are branching and merging in Subversion all the time. At the rate we're going, I'll be a Subversion guru in a week....


Thursday, January 06, 2005

Pluggable Architectures with Provider Models in .NET 1.1

Lately, these articles have sparked a lot of interest within our app dev group:

  • Provider Model Design Pattern, Part 1
  • Provider Design Pattern, Part 2

I had a client experiencing problems with their monolithic application: they were adding microsites to their site on a weekly basis, and found they had to reinvent their application framework for each microsite. At one point, they had up to 20 different assemblies referenced for each microsite. In addition, they wanted to keep the personal information in their production database private, so they didn't want to hand out their schema or access to their database. On top of that, they had a multitude of 3rd-party vendors supplying code, each with their own set of coding standards and beliefs about the application's responsibilities. Including us ;-)

We looked at building an abstract framework that all the code would reference. The trick was to make everything extremely generic so that it could be 100% transparent to the developers. This meant pushing code out of base classes and into an abstract version in the framework, or out of the project entirely and into HttpModules. The abstract framework tied to concrete providers, as in the msdn articles above, which let third-party vendors write and use their own providers for their own environments, and let the client reuse the same application framework for multiple small-to-large applications with different configured providers.

I've found these links useful, so I'm posting them here so I can find them when I need them -- and if you stumble upon them, maybe you'll find them useful too.
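
As a rough sketch of the pattern (hypothetical names -- LogProvider, ConsoleLogProvider and ProviderFactory are mine, not from the articles): the framework codes against an abstract base class, and the concrete provider type is resolved at runtime, typically from a name in web.config.

```csharp
using System;

// the abstract contract the framework codes against
public abstract class LogProvider
{
    public abstract void Write(string message);
}

// one concrete provider; each vendor could ship their own
public class ConsoleLogProvider : LogProvider
{
    public override void Write(string message)
    {
        Console.WriteLine(message);
    }
}

public class ProviderFactory
{
    // in a real app the type name would come from web.config;
    // it's passed in here to keep the sketch self-contained
    public static LogProvider Create(string typeName)
    {
        Type t = Type.GetType(typeName);
        return (LogProvider)Activator.CreateInstance(t);
    }
}

public class Demo
{
    public static void Main()
    {
        LogProvider log = ProviderFactory.Create("ConsoleLogProvider");
        log.Write("hello from a provider");
    }
}
```

Swapping providers then means changing a configuration value, not recompiling the microsites.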


Wednesday, December 22, 2004

Some handy registry hacks

I found this handy link on Dave Wanta's blog regarding registry hacks. I especially love the command line autocomplete registry hack. I've recently adopted Subversion as my primary version control system, and have found that the command line is the best and only way to go. Having autocomplete at the command line is extremely handy.


Thursday, December 02, 2004

Buying a house in Toronto - Part IV

Wow, part IV -- like it's "A New Hope" or something...

So with the offer in the works, our agents worked furiously to contact the seller's agent and set up a face-to-face presentation of the offer. The idea is that it's harder to laugh in someone's face than at a piece of paper. Once the offer is in place, if accepted, we'd only have five days to get everything sorted out: financing, insurance, lawyers, house inspection, etc. So we spent most of the day trying to get a jump on all that paperwork.

Our agents managed to co-ordinate a face-to-face meeting for around 7:30... the plan was to meet at a nearby restaurant, sign some more formalized documents (a zillion times, times four), and then have dinner while they negotiated. However, when we arrived at the restaurant, the seller couldn't reach his wife in time -- she had gone somewhere with their kids for a few hours. Since they both owned the house, they both needed to be there, and she wasn't expected home until around 9:30. All this meant our pins-and-needles tension would be dragged out longer than we expected. We had dinner, went home, sat by the phone and waited.

Around 10pm, the phone rang. There were a couple of things to be hammered out, ranging from the price to the alarm system. They had come down a little in price, which was expected. Interestingly, they were waiting to sell the house before they started looking for a new home -- so they wanted additional time on the closing date. As first-time house buyers, this was to our advantage.

Now the ball was in our court; we only had an hour to decide. We could come up as much as they had come down, and that could go two ways: we'd probably have another round of back-and-forth, or it would piss 'em off and they'd refuse our counter-offer. If we could pick the right psychological number, they'd be more inclined to accept. We came up to the lower half of the halfway point, pushed the closing date out... and waited for our agents to call us back.

Around 11:30, the phone rang again: the offer had been accepted. According to our agents, the husband wanted to sell the house and his wife didn't. They hadn't begun to look for a house yet, and were waiting to see if they could sell before Christmas. As soon as the offer was accepted, she went white as a ghost and began to bawl her eyes out.

The only thing left was the house inspection...

Buying a house in Toronto - Part III

As we returned to the city from our weekend getaway, we decided to take another route and do a drive-by on the semi to get a better feel for the surrounding neighbourhood. Turns out, there were several more houses for sale in the general area.

When our agent called us the following afternoon, they had already looked at eight other homes that were listed. "Zero for eight" -- none were even worth looking at. So we gave our agent the list of additional homes we'd spotted, and they went to work setting up appointments. Since the houses were in the same neighbourhood, we'd start at the semi and go from there.

The asking price for the semi was higher than the farm house, but after taking a long second look, Lori didn't want to look any further. Although slightly smaller, it wouldn't require any renovations whatsoever, and it had a spacious garage connected to a shared laneway.

So we stopped, and decided to find a place to talk it over. We somehow found ourselves at the scummiest coffee shop in the seediest area. The working girls and drug dealers turned tricks while we sat inside and talked. We found out later that a new shopping mall with a more reputable coffee shop had plenty of room only a few blocks in the other direction. Oddly enough, it didn't bother us.

After some long discussions, my reservations about the place were put to bed, and we decided to put in an offer. It's funny how my negatives about the house seemed to disappear when we spoke of putting the offer in nearly 18K below asking. In Toronto, most houses sell well over asking, mainly because of bidding wars. However, this time of year is the best time to look, mainly because no one wants to look / move / sell during Christmas; in some cases the market drops dead around Christmas and starts to pick up again around February. Once February rolls around, prices start to inflate dramatically. So it was now or never.

We put together the offer, listing all the items that would be included and excluded with the house, and any additional conditions we could think of. Then we had to initial the documents in about a zillion places. We took a risk and made our offer below asking price, and so the seller wouldn't get pissed off, we took the appliances out of the offer.

...we were sold, but the question remained: would the seller agree to our conditions?