Wednesday, May 25, 2005
In a recent project, we needed to render .net controls dynamically. Instead of using ascx controls, we used XSLT to transform static XML data into dynamic presentation layers -- a viable approach to providing multiple looks and feels for the site. Here's a quick walkthrough of how it works.
Dynamically loading the HTML is simple, using the built-in ParseControl method (which Page inherits from System.Web.UI.TemplateControl).
Control c = Page.ParseControl( "<b> Literal Text </b>");
The same logic can be applied to the output of an XSLT transformation:
string transformedOutput = xslTransformer.Transform(); // pseudocode -- see the full listing at the end of the post
Control c = Page.ParseControl(transformedOutput);
If you want to add server-side controls into the transformation, there are a few gotchas:
- The XSLT parser needs namespace definitions for .net and custom tag prefixes.
- Those namespaces appear as attributes in the transformed output, and will need to be stripped out before parsing the result and adding it to the Page.Controls collection.
If you're adding asp.net controls or custom controls into your XSLT file, you'll need to declare the namespace at the top of the XSLT header, otherwise the XML parser will complain.
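The declaration looks something like this -- a minimal sketch, assuming a placeholder URI for the asp prefix (the XSLT parser only cares that the prefix is bound to *some* namespace, so any unique URI will do):

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:asp="urn:aspnet-controls-placeholder">

  <!-- the asp: prefix is now legal inside templates -->
  <xsl:template match="/">
    <asp:Label id="lblTitle" runat="server" Text="Hello" />
  </xsl:template>

</xsl:stylesheet>
```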
Incidentally, if you're using Visual Studio 2003, adding this namespace declaration will give you Intellisense for your asp.net controls in your XSLT file.
With all the proper namespaces defined, you'll bypass any nasty XSLT parser errors. However, these namespaces will now appear scattered throughout your transformed output as attributes. If you leave these values in and try to parse the output into a control, the .net runtime will try to resolve the namespace declarations as attributes of the controls and will generate runtime errors.
Fortunately, a quick regular expression can strip all these namespaces out of the transformed output. Note that String.Replace won't do here -- it's a literal replacement, so you need Regex.Replace from System.Text.RegularExpressions:
transformedOutput = Regex.Replace(transformedOutput, @"xmlns:\w+=""[^""]+""", String.Empty);
With the namespaces stripped out of the transformed output, you should be able to parse the string into a control.
using System.IO;
using System.Text.RegularExpressions;
using System.Web.UI;
using System.Xml.XPath;
using System.Xml.Xsl;

XslTransform transform = new XslTransform();
transform.Load(Server.MapPath("/controls.xslt"), new XmlUrlResolver());

XPathDocument data = new XPathDocument(Server.MapPath("/data.xml"));
StringWriter writer = new StringWriter();
transform.Transform(data, null, writer, new XmlUrlResolver());

string transformedOutput = Regex.Replace(writer.ToString(), @"xmlns:\w+=""[^""]+""", String.Empty);
Control c = Page.ParseControl(transformedOutput);
Page.Controls.Add( c );
by bryan at 12:00 PM
Monday, May 23, 2005
A few years back, I had seen a mailer engine that spat out email messages in EML format. I found this engine extremely helpful because I could turn off the SMTP service, run the mailer engine, and then manually inspect the messages in Outlook Express. If the messages were fine, I re-enabled the SMTP service.
At the time, it was clear that the engine was leveraging the CDO.Message object's ability to persist to a file.
Recently, I wondered if this was possible using the .net framework, but was disappointed to find that the System.Web.Mail.SmtpMail class only exposes the method "Send".
The SmtpMail implementation is very interesting. Reflection shows that there are three internal classes: CdoSysHelper, CdoNtsHelper, and LateBoundAccessHelper.
As it turns out, SmtpMail acts as a facade over the legacy COM objects "CDO.Message" and "CDONTS.NewMail" -- it detects which environment you are running in, then delegates the sending of the message to the appropriate COM object's "Send" method.
What's equally interesting is that the .net framework doesn't use COM Interop to talk to these COM objects. Instead, it uses the LateBoundAccessHelper to instantiate the COM object by its ProgID, and then sets its properties and invokes its methods using late binding.
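A rough sketch of that late-binding mechanism (names here are mine, not the framework's -- and I'm substituting a plain .net type for the COM object, since Type.GetTypeFromProgID only resolves on a Windows box with CDO registered):

```csharp
using System;
using System.Reflection;

class LateBindingDemo
{
    // Invoke a method on an object purely by name -- the same trick
    // LateBoundAccessHelper pulls, no compile-time type required.
    public static object CallByName(object target, string method, params object[] args)
    {
        return target.GetType().InvokeMember(
            method, BindingFlags.InvokeMethod, null, target, args);
    }

    static void Main()
    {
        // With COM on Windows you'd start from a ProgID instead:
        //   object msg = Activator.CreateInstance(Type.GetTypeFromProgID("CDO.Message"));
        object sb = Activator.CreateInstance(typeof(System.Text.StringBuilder));

        CallByName(sb, "Append", "hello");
        Console.WriteLine(CallByName(sb, "ToString")); // prints "hello"
    }
}
```

The upside of this approach is exactly what the framework was after: no interop assembly to ship, at the cost of losing compile-time checking on every call.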
To stream my messages out into EML format I suppose I could use COM Interop to the CDO library, but that would mean I would have to package the interop wrapper (and its dependencies) with my assembly.
I'll have to take a look at extracting this code using a Reflector add-in, and extending the class to save to a file instead of sending directly. It'll be interesting to see if I can do this without adding an Interop wrapper for CDO. I wonder if there'll be any performance drawbacks to this approach...
yet another side-project...
by bryan at 1:01 PM
Saturday, May 21, 2005
At the end of Star Wars: Episode III Revenge of the Sith, everything wraps up nicely, making a decent bridge between Episode III and IV. However, it's a surreal experience to see the end of a saga complete itself in the middle.
So how was it? Prior to going to the show, everyone gave me one of two reviews: either "Good, the first two were bad" or "Awesome, the first two sucked." Clearly the delta between good and bad and the delta between awesome and sucked are equal.
Personally, I enjoyed it -- or rather, parts of it. Unlike all the other Star Wars movies, there seems to be a whole lot more story to tell: from Anakin's fall to the dark side and the dawning of Darth Vader, to the end of the Clone Wars and the death of the Republic, not to mention the wrapping up of one trilogy and the segue to the next.
This bulk of multiple stories can't be told with action alone, so there's a whole lot of Lucas-style dialogue that has to find its way into two hours and twenty minutes. As a result, I found it very fragmented -- only a few seconds of setup for a scene, some forced dialogue, followed by the traditional Star Wars "wipe" to the next story.
But it's done well, and clearly one of the lessons Lucas has learned since the last two films is that the special effects are the vehicle, not the story. Revenge of the Sith uses the technology from the last two films transparently. There isn't a twenty-minute pod race, nor a pointless CGI character that seems like an ILM show-off. Yoda and General Grievous are done so well that you become absorbed by the complexities of their characters rather than their near-life-like representation.
All in all -- it may be Lucas's redemption for the Star Wars saga. I might even see it again.
by bryan at 12:18 PM
Monday, May 09, 2005
Thursday, May 05, 2005
The new Star Wars movie will be upon us in a few weeks. I'm a little undecided.
I grew up as a Star Wars fan. I had all the toys, the books ... the works. Back then we didn't have VCRs, so a trip to the theatre meant your brain was on full record. I had known since I was a kid that there were supposed to be nine movies. I had played out in my head what they were supposed to be about, and I always figured that the 9th one was where Darth Vader bit it. (Side rant: how surprised was I when Luke pulled Darth Vader's helmet off in ROTJ???)
Needless to say, I had some high expectations for Episode I. But I played it down, was cool about the whole thing. I remember sitting in the theatre, just before the show was about to start, and it felt like any other movie experience. But when the green Lucasfilm logo faded and the familiar bright blue "far, far away" catch-phrase started to fade in, it dawned on me -- "I have no idea what comes next" -- and I was instantly 7 years old again.
The last two have been pretty disappointing. I don't need to go into the details. The whole thing is like a bad car wreck: you don't want to watch, but you're strangely compelled.
At this point, the trailer looks pretty good. Actually, the trailer looks *too* good. But still, Lucas could screw it up. After all, he's considered some sort of editing genius -- he could have given very specific instructions to the trailer crew: "Be sure to exclude all suck-ass parts."
You never know, Jar-Jar could show up with a light-saber screaming "Mesa using-da force!"
Please, Mr. Lucas, don't screw this one up. If you do, my entire childhood's memories will be forfeit. At the very least, please tell me you hired a dialogue coach.
I'll show up with my brain set for record.
by bryan at 11:31 PM
Friday, February 18, 2005
This month our client migrated their servers to another environment. When the actual date for the migration was upon us, it felt a lot like the moving van had arrived at the client's home while he was in his bathrobe, frantically trying to wave it off for a few more days. For the most part, the server migration went fairly well, with some issues (big and small). I've outlined a few of them -- some of which drove me crazy.

MSXML 4 - Access Denied

We've got a neat little Flash microsite that pulls an XML feed from an external site using classic ASP. Interestingly enough, the simple ServerXMLHTTP method .Send() for a simple URL was returning an Access Denied error. It turns out this is a feature of security hardening in MSXML 4 SP2. I had to change the Local Security Policy, add the URL to the Trusted Sites internet zone in Internet Explorer, and REBOOT the server. Quite a bit of hassle just for an XML feed.

Cannot resolve conflict in Collation

We restored databases from the old production environment onto the new environment, and found that some applications weren't behaving as expected. When poking into the error, I found it was caused by the Collations (the language and sort order of the database) differing between databases. This was probably because the regional settings between the machines were different, and the databases created on the new server defaulted to an incompatible collation. To resolve it I had to:

1) Create a new version of the database with a different name
2) Use an ALTER DATABASE statement to set it to the desired collation
3) Script the original database into a single block of SQL DDL statements
4) Remove all collation-specific references on fields, as the script would try to create varchar fields with specific collations
5) Use a DTS task to copy the data from one database to the other, specifying "Use Collation" in the task so the data would adopt the collation of the target machine
6) Drop the original database and rename the new version to reflect the original name

Cannot enroll in new Transaction

Brilliant. Two months ago I asked the new hosting provider if they had any best practices on how to configure an environment with a firewall between the database and web servers. The only answer I received from their tech team was to use port 1433 -- which, in layman's terms, is like saying cars need gas: SQL always uses port 1433. The problem I knew we were going to have comes when you actually try to use distributed transactions from the web server: there is a lot of communication between the web server and the database -- way more than just port 1433. When I found out that they weren't aware of this concern, that should have been my first clue. I gave up on the hosting provider making this easy for me, so I provided them very clear instructions on how I was going to configure DCOM to restrict the web and database servers to specific ports. I clearly told them that once this was done, I would need two-way (inbound/outbound) communication on these ports. I outlined very specifically which ports needed TWO WAY communication. When I received an email confirmation that they had opened the ports for TWO WAY communication, I politely thanked them and went back to configuring my applications.

When I received the "unable to enlist in new transaction" error, I was a bit surprised -- though as I hadn't had a whole lot of time to fully test the application in the new environment, not that surprised. I thought I might be having problems with incorrect registry settings, or name resolution, etc. It was about forty minutes later, after double-checking my settings and reading knowledge base articles on this problem, that I discovered that the ports had been opened for the web server, but not the database.
The email I sent the hosting provider, to which I attached my previous email with the clearly outlined instructions, was, in retrospect, not that polite. I only wrote SOME of the email in ALL CAPS. (Incidentally, why is it THAT ALL CAPS LOOKS LIKE YOU'RE SHOUTING?????)

Unable to convert varchar to datetime

When I realized that the default regional setting of the server didn't really help, I went digging into the code. We had a form that collected the data in a very specific format:

Please provide your date of birth (yyyy/mm/dd):

At the code level, some very ancient classic ASP code was building the SQL statement inside the script (terrible!!!) and opening the recordset with the resulting SQL:

strSQL = "SELECT count(*) from myTable where DateCreated = '" & Request.Form("TimeStamp") & "'"
oRs.Open strSQL, oConn

Brutal. Here we're basically asking the SQL server to try to resolve the text into a datetime using the text format the user supplied -- if your SQL box is configured with a different date format, you're pretty much screwed. While writing inline SQL inside your presentation code is considered extremely bad form, I can appreciate the developer's complaint that it's too much work to write a custom COM object just to make a silly database call. But if you have to use inline SQL, you should at least attempt to use a stored procedure. And if you're so lazy that you can't write a stored procedure and you have to write inline SQL because you're a towhead, then heaven forbid you write a few extra lines of code and use a parameterized SQL statement, like so:

strSQL = "SELECT count(*) from myTable where DateCreated = ?"
Dim oCmd
Set oCmd = Server.CreateObject("ADODB.Command")
oCmd.ActiveConnection = oConn
oCmd.CommandText = strSQL
oCmd.Parameters.Append oCmd.CreateParameter("TimeStamp", adDBTimeStamp, adParamInput, 8, Request.Form("TimeStamp"))
oRs.Open oCmd, , 1, 3

Although it's a couple extra lines, I sleep better knowing that ADO will take care of the datetime format conversion.
by bryan at 6:12 PM
Sunday, January 30, 2005
One of my current projects involves relocating my client to a new hosting environment. This is proving to be a challenge, as I don't really know the application that well yet. The code is a hybrid of classic ASP and .net, and one of the bugs that is totally driving me crazy is date formatting. The old server environment was configured with English (US) as the default locale, whereas the new environment is English (Canada). As a result, I have a few database inserts that use VBScript's Now() function to write the current time-stamp. (Not exactly sure why the database isn't providing this functionality -- but that's another problem.)

What's odd is that I changed the default locale but wasn't seeing any changes in the old ASP code -- the date format was still wrong. I had changed the regional settings (or at least I thought I had) and couldn't figure out why I was seeing the proper date format from the command-line but not in ASP. The glaring mistake I made was that I had missed the "Set Default" button at the bottom of the control panel. If you forget to click that, you're only applying changes to the current user profile.

The link for this article points to an MS KB article on some new options for the regional settings in Windows 2000 SP2+. I'll come back to that if it proves to be a problem with my .net code.
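Incidentally, on the .net side this whole class of bug can be sidestepped by never depending on the machine's locale at all -- a minimal sketch (the method name is mine, not the project's):

```csharp
using System;
using System.Globalization;

class DateFormatDemo
{
    // Format a timestamp in one fixed pattern, regardless of the server's
    // regional settings -- en-US, en-CA, or anything else.
    public static string ToSqlLiteral(DateTime value)
    {
        return value.ToString("yyyy-MM-dd HH:mm:ss", CultureInfo.InvariantCulture);
    }

    static void Main()
    {
        // Prints 2005-01-30 14:43:00 on any machine.
        Console.WriteLine(ToSqlLiteral(new DateTime(2005, 1, 30, 14, 43, 0)));
    }
}
```

Parameterized commands are still the better fix, but an explicit culture at least makes the string deterministic.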
by bryan at 2:43 PM
Wednesday, January 19, 2005
Lori and I have been counting down the days to our house closing on a chalkboard in the kitchen. Our real estate agent just called us and asked how we felt about moving the closing date up by a month! I'm all for moving in early, but we've already paid for our March rent. We're trying to figure out a way to maximize our move-in time while minimizing the money going out.
by bryan at 10:55 AM
If code is half-science and half-art, then this is either code reflecting life, or geeks seeing code when ordering coffee.
by bryan at 10:51 AM
Tuesday, January 11, 2005
Over the last year, our organization has been migrating our version control systems to Subversion. Man, what a ride.

Previously, I'd grown accustomed to Visual SourceSafe, which looking back seems like -- well, for lack of a better term, like sticking your code into a shredder and setting it on fire. While it does have its advantages (it doesn't require any brain-power to use), it doesn't do complex things well, if at all. If you go down the path of pushing the envelope and try to make it behave like an enterprise version control system... it'll bite back. Hard. Am I surly? Sure, why not. Unfair? Well, you try repairing a 3 GB corrupted VSS database and call me unfair.

Subversion is a far cry from the simplicity of VSS. It's command-line utilities only (there are open-source GUIs, but I haven't heard any good things), the concepts are radically different, and it doesn't plug into the IDE. I'm not complaining -- these are advantages. Subversion requires you to think differently about what you consider source control. Gone are the days of my team complaining that they "...can't work on that file because it's currently checked out."

At any rate, I won't be going back to VSS any time soon. I'm sold on Subversion. Coupled with CruiseControl.NET, we've got a pretty sweet development-integration cycle happening. I've also switched projects recently -- a form of promotion, if you want to call it that -- and I'm now responsible for a large account that pushed out about 62 projects last year. This means we are branching and merging in Subversion all the time. At the rate we're going, I'll be a Subversion guru in a week....
by bryan at 6:49 PM
Thursday, January 06, 2005
Lately, these articles have sparked a lot of interest within our app dev group:

Provider Model Design Pattern, Part 1
Provider Design Pattern, Part 2

I had a client that was experiencing problems with their monolithic application: they were adding microsites to their site on a weekly basis, and found that they had to reinvent their application framework for each microsite. At one point, they had up to 20 different assemblies referenced for each microsite. In addition, they wanted to keep the personal information in their production database private, so they didn't want to hand out their schema or access to their database. On top of that, they had a multitude of 3rd-party vendors who supplied code to them, each with their own set of coding standards and beliefs about the application's responsibilities. Including us ;-)

We looked at building an abstract framework that all the code would reference. The trick was to make everything extremely generic so that it could be 100% transparent to the developers. This meant pushing code out of base classes and into an abstract version in the framework, or out of the project entirely and into HttpModules. The abstract framework tied to concrete providers, as in the MSDN articles above. This allowed third-party vendors to write and use their own providers for their own environments, and allowed the client to use the same application framework for multiple small-to-large applications with different configured providers.

I've found these links useful, so I'm posting them here so that I can find them when I need them -- but if you stumble upon them, maybe you'll find them useful too.
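For my own reference, the core move of the pattern boils down to something like this -- a hedged sketch with hypothetical names, not the client's actual framework: an abstract class the application codes against, and a concrete provider chosen by type name (in real life, read from web.config) at runtime.

```csharp
using System;

// The application framework codes against this abstraction only.
public abstract class PersonalInfoProvider
{
    public abstract string GetDisplayName(int userId);
}

// One vendor's concrete provider; other vendors ship their own,
// pointed at their own environments.
public class StubPersonalInfoProvider : PersonalInfoProvider
{
    public override string GetDisplayName(int userId)
    {
        return "user-" + userId;
    }
}

public class ProviderFactory
{
    // The type name would normally come from configuration, so the
    // framework never references the concrete class at compile time.
    public static PersonalInfoProvider Create(string typeName)
    {
        return (PersonalInfoProvider)Activator.CreateInstance(Type.GetType(typeName, true));
    }

    static void Main()
    {
        PersonalInfoProvider p = Create("StubPersonalInfoProvider");
        Console.WriteLine(p.GetDisplayName(42)); // prints "user-42"
    }
}
```

Swapping providers then becomes a config change rather than a recompile, which is exactly what let each vendor keep their own data access while the client kept their schema private.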
by bryan at 4:02 PM