Thursday, December 28, 2006

Out of the silent blog

I know it's been a while since I last surfaced. The past couple of months haven't been a whole lot of fun, and I certainly haven't been my usual self. I'm not sure who I have been, or even where I have been at some points along the way.

I'm afraid I don't do muddled and confused very well. I've been severely depressed before. I've been known to become reclusive from time to time. I'm not even surprised by the occasional suicidal ideation anymore. But this is the first time that things have ever been this bad. Before I hit the hospital, I was in a dissociative fugue for about three weeks. Luckily, I was jolted back into the moment in the midst of hanging myself. Another thirty seconds, and the story would have been much different.

Did I actually want to die? Not at all. But I did have a couple of years worth of unresolved stress that I had entirely failed to deal with, and at some level I wanted so badly to be rid of it that I was willing to do whatever it took. Whatever, that is, except recognize that I had a problem and ask for help with it.

Since I walked into the CAMH emergency department, shaking uncontrollably and in tears, I've had a few good days, but most of the time I've been lost — I can't concentrate, I've had nothing that resembles a short-term memory at all, and I can find even the simplest problems overwhelming. Not exactly the ideal arrangement for a guy who's supposed to be able to make a living by making soluble molehills out of intractable mountains, is it?

SSRIs (specifically, fluoxetine) are (so far, at least) keeping the suicidal thoughts, or, rather, my willingness to act upon them, at bay, but I'm left with an enormous, undirected anxiety. No, I don't think anxiety is the right word; it's more of an adrenaline rush that simply won't go away. All of the physical components of anxiety without any of the emotional complications. For the moment, I'm controlling that somewhat with a low-dose benzodiazepine (clonazepam) every so often, but that's not an acceptable long-term solution, especially for someone who has a proven track record of susceptibility to addiction. It may be a while before I find the right medication. I'll know I've found it when that JavaScript library starts to make sense again.

I'm out of the hospital now, and have just moved back into the world from a Salvation Army post-care facility. The Sally Anne? Yeah, I had done everything leading up to my move to ZA, including divesting myself of unnecessarily bulky winter clothing and remaining furniture, etc. I count myself incredibly lucky to have had my melt-down before getting to the airport; I can't imagine what might have happened had I arrived in a foreign country in a catatonic state.

For the next little while, my life will be a mix of doctors, support groups and the like. And I need the support, if for no other reason than that I need to vent a little of the frustration I feel at being reduced to a comparative idiot. All my life, I have been able to take my intellect for granted. It has been my defining personal quality, informing everything else that I am. Without it, or at least without full-time access to all of it, I am a little less sure of who I am and where I fit into this world.

It's not all bad news, though. There's always been something about being knocked off of my high horse that brings the truly important things in life back into sharp focus. My time at the Sally Anne reminded me that, despite my breakdown, things could always have been worse. I have become very active in AA again. If nothing else, my story can serve as an example to others that a bad break or two in sobriety is not a sufficient excuse for relapse, and that sobriety alone does not necessarily mean an end to the struggle of living. There is, after all, no problem so big that alcohol can't make it bigger. And whether my story helps anyone else or not, working with still-suffering alcoholics is pulling me out of my little cesspool of self-pity. I have only ever really been myself when I've forgotten about myself.

I hope to be able to get back into the development game again someday soon, but that day is not today. And if the day should never come, if I should never completely regain my abilities, that's okay too. I can be at peace with it. It was incredibly flattering to see myself described as a "legend" by colleagues I've admired since I began working with Notes and Domino, but my inability to accept that reputation was a large part of what led to my near-demise.

I am used to coasting through life. I have had a lifetime of accolades arising entirely from my innate talents, all the while knowing that I had been pulling the wool over everyone's eyes. I have played the great impostor, knowing enough of how to talk the talk that I could fool most experts into believing that I was an expert on any subject. This Lotus stuff was the first time I ever really put an effort into learning anything, and yet there was always the feeling in the back of my mind that I was fooling everyone again, that whatever reputation I had developed was a shoddy façade that would collapse the moment I was actually challenged. And let's face it, I don't know everything there is to know about Notes and Domino, even on my good days. I don't think anybody does, or that anybody can anymore. There's just too much there there. Nonetheless, I was determined that I would need to know everything before I could really feel like I knew enough. Talk about setting oneself up for catastrophic failure!

Thanks to all who have written. I needed the boost. And my most sincere wishes for you and yours that the new year bring happiness, love and contentment in generous measure.

Wednesday, September 20, 2006

Greetings from CAMH

A funny thing happened on the way to South Africa -- I ended up in a mental hospital instead. Still trying to figure out what's going on -- but as long as I'm here, the likelihood that I'll find myself wandering the streets for days with a couple of suitcases in hand, pulling knives on strangers, is fairly low. Could be severe depression, could be hypomanic bipolar. SSRIs seem to be working for now, but it will be a while before we know anything for certain.

I wish I had seen this coming. It could have saved a lot of people a lot of trouble, and may even have saved a life. Sorry.

Tuesday, August 29, 2006

Building the Behaviour Layer

With all of this talk of fancy libraries and so forth, and with the whole Domino web world abuzz with tales of Ajaxian glory, it's sometimes hard to remember that Domino already provides us with a lot of sophisticated functionality. You say you need a field to refresh? There's a checkbox on the Field Properties box to handle that for you. Sure, it takes a post-back to do it, but most web application platforms have done much the same thing since the dawn of time (about eight years or so ago). All Ajax brings to the party is immediacy and fluidity, really. You can make an Ajax app last all day on a single page — something like Writely is meant for that sort of user experience — but the average Domino application (or application suite) is more naturally suited to defined "locations". That is, each task type would have its own workspace. Changing pages is not a dreaded thing to be avoided at all costs.

Okay, what does all of this blathering mean? It means that Domino views, forms and pages created in Designer already do pretty much what our users expect them to do. They just might not do them in the most elegant of ways. For instance, it may take a user several seconds (during which, of course, she has been typing madly away to record one of those elusive thoughts that will never be formulated quite the same way again) to realise that the form has already been submitted for refresh and all of her recent work has gone for naught. That is something that the immediacy and fluidity of Ajax can alleviate.

Now, as proud as I am of the bits of the library I've already put together, I am aware that as it stands, it still requires that a developer's efforts be concentrated on cooking a web application. atDbLookup() is way freakin' kewl (and will be so much kewler when I've gotten the automatic request and callback marshalling bulletproofed), but somebody still has to add it to a form, field, or button event in order to make it work. I really can't wait to see what other people can do with the tool once it has matured beyond the pre-alpha stage. I'm sure there are a couple or three web wonks out there who will build things of exquisite beauty when they don't have to concentrate so much on the mechanics of the app. They will, though, be aware that they are building a web application at every step along the way. Even something like prettyView() takes developer input, if only to add the code to the database and the passthru tag to the $$ViewTemplate. It doesn't have to be that way.

This is where Jack Ratcliffe and I meet face-to-face, so to speak. The Domino Web Tools project was meant to automate a lot of those processes, and I'm pretty sure that having direct Formula analogues handy will make that go a lot more smoothly. As I get deeper into the library project, it's easy to see how something like DWT can become overwhelming and go fallow for a while. It's a Project with a capital P. It was great to see Jack responding to the original library posting.

Right now, when you have "Use JavaScript to generate pages" turned on and select one of the refresh options on a field, Domino generates an event handler like this in the onchange: _doClick("$Refresh",this,null). Overriding a handler like that is very easy to do in JavaScript: document.forms[name].FieldName.onchange = someNewFunction. As long as it's called after the form is loaded, it's nearly foolproof. That means that right now, today, almost anyone reading this has all of the tools they need to create a "behaviour layer" that runs through a Domino web page and hangs the stuff they want to hang off of field events, etc. It isn't at all difficult, but it is extra effort, and it is oh, so easy for the behaviour layer (the custom event script for a page) to get out of sync with the form or page design.
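
By way of illustration, here's roughly what that override looks like. The field object below is a stand-in so the snippet can run anywhere; in the browser you'd be assigning to document.forms[name].FieldName.onchange after the form loads, and the field and handler names here are made up for the example.

```javascript
// Domino's generated handler (when "Use JavaScript to generate pages" is on
// and a refresh option is selected) looks something like this:
//   onchange = function(){ _doClick("$Refresh", this, null); }
// fakeField is a stand-in object so the snippet runs outside a browser.
var fakeField = {
  name: "Subject",
  onchange: function () { return "_doClick"; } // pretend post-back refresh
};

// The behaviour layer just assigns a replacement after the form has loaded.
function attachBehaviour(field, handler) {
  field.onchange = handler; // Domino's handler is gone; ours runs instead
}

attachBehaviour(fakeField, function () { return "ajaxRefresh"; });
```

The same one-line assignment works for onclick, onblur and the rest, which is all a "behaviour layer" script really is.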

That's where the more ambitious part of Domino Web Tools comes in. In the early hours of the project formation, at least, there was much talk of automating as much of the webbification stuff as possible. You know, reading a form's design, say, and auto-generating the replacement parts. Adding datepicker widgets with the appropriate regionalisation to date fields, that sort of thing. It's all of that event code, though, that's the real sonofabee, at least when you have to create equivalent client-side web code. That is, I hope, where this library would make the greatest difference.

There are boatloads of Notes-client-only developers out there, many of whom never, ever stray beyond "pure Notes" applications (if they have to talk to another system, they'll use something like Notrix to do the talking), and who never set foot outside of the comfortable world of Formula Language. Many are only part-time developers — they have "real" jobs filling up most of their time, and learning the intricacies of HTML, CSS and JavaScript is not something they'll give up their precious family time to do. Now imagine what it would be like for them to be able to create a truly usable, responsive, and (perhaps) even good-looking Domino web application by merely copying a couple of Design Notes into their database (or running an agent against the db to do it for them). Imagine the appearance transformed to look more like the thing they created in Designer. Imagine "Allow values not in list" still allowing a drop-down selection. Date fields with the date selector enabled still having them on the web. View dialogs and name-and-address pickers acting just the way you'd always imagined they should. Post-back behaviours replaced by in-browser updates. Computed fields computing right there in front of the user the way God intended. The one critical customer-interactive application a small Notes shop has does not have to be an aesthetic or functional embarrassment.

Ultimately, that's what I'd like to see happening. As much as I'm jazzed about super-innovative web development on Domino, I think that it may be far more important to the platform to make the low-hanging fruit hang a whole bunch lower.

It's one thing to add calls to a function like atDbLookup to a form. It's a relatively simple thing to make a server request to update something on the web page. But if you use the example I've given before, where the onchange of a field contains something like refreshSelectOptions("FieldName",atDbLookup("cache","","LookupView",this,2)), there is an implied contract of synchronicity. The refreshSelectOptions() function can't do what it needs to do until the atDbLookup() function has returned its values. Well, hell — we've just managed to lock up the user's browser the same way we would have by posting the form for refresh. Maybe not for nearly as long (and thanks to the way I do the caching, not every time, either), but there is that moment where the browser can't do much but wait for a reply. The big "A" at the start of the word "Ajax" stands for "Asynchronous". The object of the game is to notify the user that something is happening, but stay out of their way otherwise. That means divorcing the main activity from the event that caused it, and giving the request all of the info it needs to look after its own affairs when it's complete.

That gets a little dicey when you want to abstract things away from the developer. In normal Ajax development, one would be hands-on enough to know where you are making requests and what sort of response handler you should attach to the request. That is precisely the sort of thing I want to take off of the developer's plate. In the example above, if the lookup isn't in the cache, then atDbLookup() should immediately return something to the refreshSelectOptions() function so it can terminate and let the user get on with life. In the meantime, the lookup can take as long as it takes (that depends on the connection speed and server load), and when it's done, it should be able to complete what refreshSelectOptions() started. Making that work cross-browser and one hundred percent reliably while preserving the entire calling context is one of those easier-said-than-done things. (Return to main text)
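
For what it's worth, here's a toy sketch of that contract. None of these names come from the actual library — atDbLookupSketch, deliverResponse and the cache variables are inventions for the example, and the "server" is just a queue — but the shape is the point: a cache hit completes synchronously, and a miss returns a placeholder immediately and finishes via callback later, so the browser never locks up waiting.

```javascript
// Invented names, not the library's real API.
var lookupCache = {};
var pendingRequests = [];

function atDbLookupSketch(key, onComplete) {
  if (lookupCache[key]) {
    onComplete(lookupCache[key]); // cached: complete immediately
    return "cached";
  }
  // Not cached: remember how to finish the job, then get out of the way.
  pendingRequests.push(function (values) {
    lookupCache[key] = values;
    onComplete(values);
  });
  return "pending";
}

// Stands in for the XmlHttpRequest response handler firing later.
function deliverResponse(values) {
  pendingRequests.shift()(values);
}
```

The onComplete callback is what carries the calling context forward — in the real thing, that's where something like refreshSelectOptions() would pick up where it left off.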

Saturday, August 26, 2006

And people wonder why I prefer passthru....

Well, it seems that you folks were able to break my little piece of script pretty darned thoroughly. The "No Documents Found" bit was entirely my fault. Pulling any old bit of script off of the ol' hard drive and posting it with only minimal testing is not a good idea. It worked in the project I was doing, but that's because I had "pinned" documents so there was always something in the view.

The other breakage, though, I would never have picked up. You see, I never, ever, use Domino WYSIWYG to create headings and so forth. There is no semantic value to anything created with FONT tags, so I always make sure that things like that are done in passthru. And I never use the alignment settings, since they create markup that you can't override with CSS. (The behaviour here was actually better in R5, since CSS could tell the browser to render a CENTER tag with any alignment you saw fit to use. The DIV with an inline style can only be changed using JavaScript, because inline styles will trump whatever is in the stylesheet.)

The problems with the script are twofold. First, if you create a WYSIWYG heading, Domino generates a DIV around the FONT tag soup it needs to render your semantically useless verbiage (sorry, folks, but unless you use an Hx tag to mark a heading, it ain't a heading). That wouldn't be so bad, except that it will eat the passthru DIV you created around the view by putting the heading's closing /DIV tag after your passthru. Easy to fix, you'd think -- just make sure there's at least one line of not-centred text between the heading DIV and the passthru, right?

Well, that does close the alignment DIV tag before the passthru DIV, but it creates a new problem. Domino may throw in an unclosed P tag, just for fun. Now, I have no problem with Domino rendering a line break with a paragraph. It might not be semantically correct when there is no actual paragraph content, but it's not a biggie, either. What burns my butt is the fact that the tag is never closed. That throws the whole document structure for a loop. The P will not be logically closed until its parent, Domino's FORM tag, is also closed.

There are three ways to try to handle this. The first is to futz around with the form in Designer until you can force Domino to render a BR instead of a P. A BR tag is not a container, so it won't interfere with stepping through the hierarchy in either direction. I'm not quite sure what the magic words are, or what face you have to make while doing it, but it CAN be done. Obviously, it's not particularly reliable, and the next developer to touch the form is going to break it. You can also try closing the P tag yourself, but this will get you no further -- eventually, someone is going to do something that changes the P to a BR. Or worse, they're going to try to get rid of the extra space between the heading and the view, closing off the DIV tag yet again.

The second is to expand the script yet again to cover all of the possible bases, searching every nook and cranny of the DOM until the view table, and only the view table, is finally found. You saw how big the original script was. Scanning the whole DOM tree until you find something that looks like it might be a view will make that little script just a little bit bigger. I'm not afraid of a little scripting (remember the library I keep talking about?), but I do resent having to write that much code to get around something that should never have been a problem in the first place.

The third is to lobby IBM for fixes. No, I don't expect them to add an HTML-prettifying function to the view rendering. In fact, it's just about the last thing I'd want to see, since it would mean that "normal" rendering could never be controlled with CSS -- you'd absolutely need to run some script to change the look of what Domino renders. (Yes, I know I'm running script now, but that's to add stuff that doesn't need to be there.) To my mind, something like that would be far worse than the current state of affairs. What I would like to see is a fix that gets Domino out of my way. If I start some passthru on a new line, and that whole line is marked as passthru, I want it to appear in the HTML source exactly where I put it, not inside a tag that Domino needs to render the previous line. In this case, if I wanted the viewPanel DIV to begin inside the heading alignment DIV, I should have to put it in the centre-aligned part of the text. Things like the document selection JavaScript should be moved completely out of the way -- the <head> sounds like a nice place to put that, doesn't it? And since a response-only column is the last usable column in a response document's row in a view, wouldn't it make a lot more sense to colspan the host cell all the way to the end (as this script does)?

No, folks, I am not whining about Domino's overall HTML rendering. While it can be improved somewhat (and Mark Vincenzes and team are working on it for Domino Next), it is very nearly impossible to create semantically-correct markup from purely visual formatting in a reliable way. A human being with good vision looking at the page can intuit the meaning of the text based on its formatting, but a machine can only do so much. What if your headings are smaller than the body text (a common-enough thing in trendy print publications)? There are some things we need to do ourselves if we want the end result to be meaningful and accessible. But there are a few little bugs that need to be squashed. They might not get the fixes into the current code streams (6.5.x & 7.x), but we can try to make sure that they are gone for Domino Next (no fixed number has been announced yet, but it has been stated that the number will be somewhere between 7.9 and 8.1).

In any case, the script as it now stands has been through the wringer. It'll find the view table if it's there and at least comes somewhere after the viewPanel DIV tag, as long as you don't put another table between the view and the div tag. If there's a "No Documents Found", it fails gracefully -- again, as long as you don't put an H2 tag into the middle of the actual viewPanel DIV AND use a centred, justified, or right-aligned WYSIWYG heading right above the passthru. (I'm only looking for the H2, since I can't predict the language of the message.) I've done everything I can to mangle the $$ViewTemplate and the view, and it seems to be bulletproof. Well, at least until one of you folks tries to use it, that is -- somebody's going to have done something I never would have thought of trying, and that will probably result in Domino rendering the table cells outside of the table tag or something. Can you ever really win at this game?

Friday, August 25, 2006

Library again....

Working on the Formula Language analogue functions in the JS library has got me wondering. There are several @Functions (well, a lot of them, really) that will return either a single value or a list of values. In Formula Language, that's all fine and dandy, because except in a very few cases, EVERYTHING is a list. That is, you can spend most of your time oblivious to the actual output of the various lines of code you write; you don't need to make special concessions for "text" versus "text list". I've honoured that in the JS functions' inputs -- they can take a single value, an array of values, an HTML field object, or, in combination with atGetField(), a fieldname.

But what about the output? As it stands right now, I am creating the output in an array, then passing the array out if there is more than one value, and passing a scalar if there is only one value. While that's great if you want to cascade these "@Functions", I was thinking that it might make it unnecessarily difficult if you merely want to borrow a function or two to use in your own JS code. For instance, using atReplaceSubstring() is easier than writing something to iterate through two arrays, generate regular expressions on the fly, and do the replacement oneself. But is the return type uncertainty (array or scalar) going to be a problem? Should the return always be an array when the corresponding @Function allows a multi-value return?
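
One possible way to split the difference (strictly a sketch, and asList is my name for it, not part of the library): return whatever is natural from each function, and give borrowers a one-liner to coerce any result into Formula-style "everything is a list" form.

```javascript
// asList() normalises an atFunction return: a scalar becomes a one-member
// array, an array passes through, and null/undefined becomes an empty list.
function asList(value) {
  if (value === null || typeof value == "undefined") { return []; }
  return (value instanceof Array) ? value : [value];
}
```

That keeps cascading convenient for the "@Function" crowd without making standalone borrowers write their own type checks.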

For the record, these are the functions I've already completed:

  • function atDbLookup()
  • function atDbColumn()
  • function atExplode()
  • function atGetDocField()
  • function atGetField()
  • function atImplode()
  • function atLeft()
  • function atMember()
  • function atMiddle()
  • function atReplace()
  • function atReplaceSubstring()
  • function atRight()
  • function atSetDocField()
  • function atSetField()
  • function atSort()
  • function atTrim()
  • function atURLDecode()
  • function atURLEncode()
  • function atURLQueryString()
  • function atWord()

As much as I would like to have used a proper "@" in the names, it's an illegal character thanks to Microsoft's conditional evaluation. Okay, the list isn't very long yet, but the atDbLookup() and atDbColumn() functions took a while, as did atGetDocField() and atSetDocField(). All of them rely on XmlHTTPRequests, and the xxxDocField functions need a server-side agent to work. Okay, we're talking about the simplest, smallest web agent ever, but it's still an extra piece. And it will grow as more server-interactive atFunctions are added, like atUserRoles(). Even atURLQueryString() has more to it than you'd think. And don't get me started on atReplaceSubstring(). There are also a few helper functions to do listwise and permuted operations, since JavaScript doesn't know that [text array] + [string] should add the string to every member of the array.
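
To make the listwise idea concrete, here's a rough sketch of such a helper (the name is illustrative, not one of the real helper functions): it appends a string to every member of a list, and pairs two lists member-by-member, reusing the last member of the shorter list the way @Formula operations do.

```javascript
// Listwise "+" for text, Formula Language style: scalars are promoted to
// one-member lists, and the shorter list's last member is repeated.
function listwiseConcat(left, right) {
  left = (left instanceof Array) ? left : [left];
  right = (right instanceof Array) ? right : [right];
  var len = Math.max(left.length, right.length);
  var result = [];
  for (var i = 0; i < len; i++) {
    result.push(left[Math.min(i, left.length - 1)] +
                right[Math.min(i, right.length - 1)]);
  }
  return result;
}
```

So listwiseConcat(["A","B"], "-1") gives ["A-1","B-1"], just as [text list] + [string] would in a formula.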

I am also posting this, or something like it, on the LDD forum, but again, your input is valuable to me.

Thursday, August 24, 2006

SntT -- A prettier view

Not everything needs to be Ajaxian. A simple bit of JS can make a standard "Display using HTML" on a $$ViewTemplate look and work a whole bunch better. No hand-coded "Treat as" views, no fancy background requests need apply. Just a passthru <div> tag around the embedded view, this script, and init() in the onload event of the $$ViewTemplate.

By way of explanation for the long listing to follow:

First, I NEVER comment code quite this way in production. The comments here are for the benefit of people who may not be very familiar with JavaScript, and might have trouble following the "story" otherwise. If you want to use the code, do everybody a big favour and delete the comments. If you are at all familiar with JS, you'll probably find it easier to follow the code without the comments anyway. (They do get in the way, don't they?)

Second, there is a lot of explicit use of getElementsByTagName(). I'd never let something like this hit production with all of those wasted characters floating around in there. I use a simple little function instead:

function $tn(tn,el){el=el?el:document;return el.getElementsByTagName(tn);}

So, instead of:

var viewTable = viewPanel.getElementsByTagName("table")[0]

I would write:

var viewTable = $tn("table",viewPanel)

That is a half-truth at best. I'd probably write "var vw=$tn("table",vP)". Or at least have an obfuscator do it for me. I'm not "into" obfuscation, as a rule, but in an interpreted language where the user has to download the source code, sometimes over a bad POTS modem connection, killing off characters is the best way to save the plot.

The following little bit of JavaScript takes an ordinary, Domino-created, "Display using HTML" view and transforms it into something that looks and acts a little more "applicationy". It preserves the selection margin and any clickable headers. Flat? Single category? LOTS of categories? Response docs? Response-to-responses? Action bar? It's all good. Try it out, play with it. You may like it, you may not. And no, Peter, it doesn't mess with the correct functionality when you open and close categories -- the clicked category scrolls to the top if the page is long enough to be scrollable. Sorry about the formatting -- it's gonna be a little bit on the wide side. You'll want to copy and paste this into something that has syntax highlighting (like a JS script library in Designer) -- trying to make it really pretty here makes it too wide for the screen. Even if you have a fifty-incher.

function prettyView(){
  var debugPos = "";

  //This function adds a whole-row mouseover and click
  //event to a Domino "Display using HTML" view.

  var panel = document.getElementById('viewPanel');
  //assumes you have wrapped the view in a DIV with an ID of "viewPanel"

  //Getting to the view may take some work. You KNOW the
  //table lies inside your DIV, but Domino may just have
  //closed your passthru DIV without asking.
  //No, it SHOULDN'T happen. Yes, it DOES.
  //Try the easy way first
  var viewTable = panel.getElementsByTagName("table")[0];
  //If that didn't work...
  if (!viewTable) {
    //It might be because there were No Documents Found...
    if (panel.getElementsByTagName("h2").length) {
      return; //nothing to prettify -- fail gracefully
    }
    //...or maybe Domino ate your DIV for lunch.
    else {
      panel = panel.parentNode;
      if (panel.tagName) {
        viewTable = panel.getElementsByTagName("table")[0];
        //Of course, the No Documents Found rule could still be in effect...
        if (!viewTable && panel.getElementsByTagName("h2").length) {
          return; //again, fail gracefully
        }
      }
    }
  }
  if (!viewTable) { return; } //still nothing to work on -- give up quietly

  //First, fix the situation where a collapsed categorized view
  //is all squished over to the left-hand side of the browser
  //Then get the collection of rows.
  var rows = viewTable.getElementsByTagName("tr");
  if (rows.length) {
    //We don't want to mess with the header row if it's there.
    //It may contain column resort hinkies.
    //Domino 6 and up renders the header cells as TH elements.
    var startRow = (rows[0].getElementsByTagName("th").length)? 1 : 0;
    for (var i=startRow;i<rows.length;i++){
      //Now, make sure the row is in the view table.
      //Response documents may be rendered in nested tables.
      var grandparent = rows[i].parentNode.parentNode;
      if (grandparent === viewTable) {
        //the first "parent" is an imaginary tbody element
        //so it's row->tbody->table to get to viewTable
        var href = "";
        var titleText = "";
        //Add the mouseover highlight to each row...
          rows[i].onmouseover = new Function("this.bgColor='#ffff99'");
        //...and return to the original color on mouseout.
        //This maintains any alternate colors set in the View Properties.
        rows[i].onmouseout = new Function("this.bgColor='" + rows[i].bgColor + "'");
        //Now get all of the cells in the row...
        var cells = rows[i].getElementsByTagName('td');
        for (var j=0; j<cells.length; j++) {
            //There is going to be an error at the end of this TRY
            //related to garbage collection of the objects we create here.
            //It is unavoidable.
            //The best we can do is CATCH the error and ignore it.
            //We don't want to change the behaviour of cells in the
            //selection margin.
            if (!(cells[j].getElementsByTagName("input").length>0)) {
              //We need to find links to create the whole-row click.
              var links = cells[j].getElementsByTagName("a");
              if (links.length) {
                var count = 0;
                var link = links[0];
                //Not all A tags represent links. We will pass over
                //any named anchors (A tags with a NAME and no HREF).
                while (!link.href || link.href == "") {
                  link = links[++count];
                }
                href = link.href;
                //We also need to know what's inside the link.
                var children = link.childNodes;
                var testNode = children[0];
                if (testNode && (typeof testNode == "object")){
                  if (testNode.tagName && testNode.tagName.toLowerCase() == "img"){
                    //In this case, it's a picture -- probably a twistie
                    titleText = testNode.alt;
                  }
                  else {
                    //Otherwise, there's got to be text in there somewhere.
                    //It may be nested in FONT tags, and there may be empty
                    //DOM nodes.
                    while (testNode.childNodes.length) {
                      testNode = testNode.childNodes[0];
                    }
                    while (testNode && testNode.nodeType!=3 && testNode.nodeValue!=""){
                      testNode = testNode.nextSibling;
                    }
                    //After all of that, we may not have any text...
                    if (testNode) {
                      //...but if we do, we replace the original link with plain text.
                      var swapNode = link.parentNode;
                      swapNode.replaceChild(document.createTextNode(testNode.nodeValue), link);
                      //NOTE: Those last two lines help the view LOOK a lot prettier,
                      //but they also affect accessibility. Keyboard-only users will
                      //not be able to tab from link to link. If accessibility is
                      //important, comment those two lines out.
                      titleText = "Click to open " + testNode.nodeValue;

                      //For some unknown and unholy reason, Domino renders response
                      //documents in nested tables in one column of the main table.
                      //Not only does it make this sort of code harder (whine, grumble),
                      //but it also means that response docs will shove the main document
                      //content over to the right. This will fix that by removing
                      //the final cells in the main table's response row and adding
                      //their width to the response cell. The rest of the table can then
                      //collapse back to normal size.
                      if (rows[i].getElementsByTagName("table").length) {
                        //This is a response doc, and we are stuck in a nested table.
                        //In order to keep the responses from pushing everything over,
                        //we need to find the outer cell containing the table...
                        var parentCell = cells[j].parentNode;
                        while (!parentCell || parentCell.nodeType != 1 || parentCell.tagName.toLowerCase() != "td") {
                          parentCell = parentCell.parentNode;
                        //...and work on getting rid of the following cells
                        var killCell = parentCell.nextSibling;
                        var removedCellCount = 0;
                        while (killCell) {
                          //Before removing any cells, we need to find out how wide they were.
                          var oldColspan = killCell.colSpan;
                          killCell = parentCell.nextSibling;
                          removedCellCount += oldColspan;
                        //Now we add the width we removed to the response cell...
                        parentCell.colSpan = parentCell.colSpan + removedCellCount;
                        //...add the onclick event ...
                        parentCell.onclick = new Function("getLink('" + href + "')");
                        //... and the mouseover text.
                        parentCell.title = titleText;
                        //Finally, we change the cursor to tell the user they're
                        //mousing over a link.
               = "pointer";
              if (href != "") {
                //If, after all of that, we have a link location to use,
                //we add an onclick to the cell to take the user to the link...
                cells[j].onclick = new Function("getLink('" + href + "')");
                //... and add the mouseover text.
                cells[j].title = titleText;
                //Finally, we change the cursor to tell the user they're
                //mousing over a link.
            //ignore -- it's because of nested tables on response rows

function getLink(){
  var el=arguments[0];
  if (typeof el == "string") {
    window.location.href = el;

function init(){

I try to avoid calling any function in the onload that isn't called "init()" -- that means I can change the function names in JavaScript with a search-and-replace and never have to worry about changing the body onload. The init() function calls the prettyView() function, and the prettyView() function adds an onclick call to the getLink() function. You'll need all three in your JS script library or JS Header.

You can improve the appearance of the view by adding the following CSS:

#viewPanel TABLE {
  font-size: 1em;
  border-collapse: collapse;
}
#viewPanel TR, #viewPanel TD, #viewPanel TH {
  border-bottom: solid black 1px;
}
#viewPanel TABLE TABLE TR, #viewPanel TABLE TABLE TD, #viewPanel TABLE TABLE TH {
  border-bottom: none;
}

You need to set the border-bottom of the TD and TH in order to get any lines at all. The lines won't extend all the way across all of the cells, though, unless you also set the border-bottom for the TR. The border-collapse: collapse; makes sure that both sets of lines look like one. The selectors with TABLE TABLE in them make sure that the response document nested tables don't get multiple bottom borders.

UPDATE: Sometimes I hate this posting of snippets stuff SO much. There was a small problem with what I posted yesterday, in that it came from an earlier version of the original file. (A quick look at getLink() should tell you that it was excerpted from a bigger mess -- it's designed to handle links based on table row ids as well as href values.) The actual, honest-to-goodness production code lives in a template on a server (or group of servers) to which I haven't had access in a couple years, so I had to rely on what I had in text and *.js files here. I gave it a quick test before posting, but then I tested it again, and, well....

The changes live in the little loop where I go looking for the parent cell of response documents. The original code would break if something other than 10pt Default Whatever Plain is selected as the font for the responses-only column. That has been changed so the code will continue upwards to find the containing cell. I've also made the link replacement code two lines instead of one to solve a node resolution problem introduced when looking for the outer cell. For some reason, doing this:

was a problem, but doing this instead:
var theSameNode = someNode;
fixes it. As the code above implies, it's the same node. Not just the same HTML element, but the identical object. The identity check, (theSameNode === someNode), will return true. Yet the two-line version works in every browser I could test, and the one-line version fails in almost all of them. Only Opera, usually the worst browser for complex JS because it swaps engine components when you change its spoofing settings, actually got it right all of the time. Mozilla and IE would bail if the link was on a responses column with a font setting.
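The identity claim itself is easy to sanity-check outside the DOM -- here's a minimal sketch, using a plain object as a stand-in for a DOM node:

```javascript
// Assignment copies the reference, not the object, so both names
// point at the identical object. (A plain object stands in for a
// DOM node here; the browser quirk only shows up with real nodes.)
var someNode = { nodeValue: "example" };
var theSameNode = someNode;
console.log(theSameNode === someNode); // true
theSameNode.nodeValue = "changed";
console.log(someNode.nodeValue); // "changed" -- it's the same object
```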

I've thrown in the only fix I could think of for the view title alignment problem noted in the comments. Oh, and the <h2>No Documents Found</h2> has been fixed, too (that was in the working original). NOW I know that multiple, seemingly-identical bits of snippetalia on a drive probably means that one is right and most of the rest are just-in-case backups that should have been nuked. Oh, well.


Friday, August 18, 2006

Speaking of huge libraries ...

... is there any interest out there in a general-purpose JS/Ajax library specifically for Domino?

I've seen adaptations of Prototype, Sarissa, Scriptaculous (and even Rico, which is a lovely library but wants everything in its own format) and so on, but would there be a market for a library of components that grok Domino? One that would not force you to write responses that conform to a library designed for other platforms? One that would include not only JS functions that already know about ?ReadViewEntries, categorized views and response documents, but design elements that can make up for the difference between server time and workstation time, that can prevent save conflicts, and so on? One that lets you dynamically refresh fields using, oh, let's say @DbLookup syntax in JavaScript?

If there is any interest, what features would you like to see? Be wild and scary if necessary. There's no law that says I have to include everything everybody could ever want. And I've already got a few tricks up my sleeve that may be as wild and/or scary as what you have in mind. I haven't quite rewritten the Formula Language engine in JavaScript yet, but I have borrowed a few of the more useful list-processing bits, along with their familiar-to-us syntax.

It is my intention to release such a library into the wild. The library itself will be free to use anywhere, both as in speech and as in beer. Thanks to View->Source and browser extensions, there's no good way around that. Even if the code is obfuscated (and it is, has been, and will continue to be, for compactness' sake), a determined developer can copy it and find a way to use it. The API documentation, though, I can charge for, so I will. Or, rather, somebody will.

The documentation will include the clear, unobfuscated code with comments. While the clear code would be entirely impractical as a user download (it is not a little file, or even a bunch of little files), forcing an interested developer to figure out which bits do what in the obfuscated code alone is not going to result in better code or in a better developer. Properly done, this might make a good addition to a certain training materials package, since the doco is as much about "why" as "what".

You see, I'd like to see this, or something not unlike it, become a standard part of the Domino developer's toolkit. While I would like to see every developer equipped with the knowledge and skills to do all of the work themselves, it just doesn't make sense that each of us needs to reinvent the wheel with every new application. Okay, the truth is that I was tired of reinventing wheels by the third go-around. So I've taken the bits and pieces of what I've been doing over the last year or so, located the reusable bits (and refactored to include them where they'd been redone), and come up with something I think is worth sharing. But I know that I've only included features that I needed at the time, and that there have to be things you might think are commonplace that have never occurred to me.

So if you can think of anything you'd like to see, anything that can help you elevate a garden-variety web-enabled database to a holy-crap-gotta-have-it application with minimum effort, let me know. Your suggestion may be what makes the whole thing worthwhile.

How nifty does it need to be?

A while back, Rocky Oliver floated the idea of resurrecting the Nifty Fifty, a group of fifty minimal sample applications that IBM threw into the box with every purchase of Notes 3.somethingorother. They didn't do much, and were never really intended to do much. They were fifty sets of ideas, fifty starting points for developers to build upon.

Lately, the idea has been getting a lot of play, thanks mainly to John Head. John suggested that, with Microsoft entering the fray and including application templates with the various incarnations of Sharepoint, the community should step up and contribute a set of "real" applications to run on Notes and Domino. The community consensus seems to be that the idea would not fly unless the apps were distributed and supported by IBM.

I guess it's time to throw my coupla cents in.

Should there be a set of application templates available that go beyond Discussion, Document Library and TeamRoom? You betcha. Does the set need IBM distribution and support? Again, you betcha. Do these applications need to be everything an organisation could lust after out of the box? Not on your nelly, nor on mine, neither. Not no-how.

What Domino needs is fifty (give or take) Really Good Ideas™ rolled into a neat little package. The applications need to be useful and usable, but each should really only try to do One Thing Right™. They should do that clearly and with voluminous documentation even for the most obvious bits. And while an application would need to include all of the ancillary bits that make its particular Really Good Idea™ work, it should absolutely cry out for extension and customisation.

A big part of what got me started down this track was jonvon's list of suggestions. In particular, it was the suggestion that there should be no obfuscated JavaScript (as one finds in Domino Web Access). As I've been doing a lot of webby things lately, it occurred to me that clear, self-documenting JavaScript on the scale required for a genuine Web Application™, complete with all of the neat Ajaxian features and behaviour layers and everything else that represents large-scale functionality these days, would make the needed libraries huge. Not merely big, but huge. Obfuscation might be a maintenance programmer's worst nightmare, but failing to make the downloadable component as small as it can be by using short names and proxy functions is working at cross purposes to the user community. Sure, it's easier for a developer to see what's going on and make modifications to the code (that is, without having the same obfuscator and replacement key file), but at the cost of code that can be as much as 80% larger.

Let's take the Prototype library's $ function as a basic example. The name is impenetrable (and forget about documentation — Prototype was not intended for human developers to interact with, so it's enough that Rails understands what's going on), but by the simple act of replacing document.getElementById("someID") with $("someId"), you save twenty-two characters with every call, recouping the cost of the function in only two calls. Twenty-two characters might not sound like much, but one might make a call like that a hundred times or more in a complex script, and that's more than 2KB saved right there. Short variable names come with similar savings, as do snippet "constants" fed into the constructor of a function or an object.
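For anyone who hasn't peeked inside Prototype, the heart of the trick is tiny. This is a stripped-down sketch, not the real implementation -- the actual $ also accepts element references and multiple ids at once:

```javascript
// A minimal stand-in for Prototype's $ function: a one-character
// proxy for the most common DOM lookup there is.
function $(id) {
  return document.getElementById(id);
}

// Each call site shrinks from document.getElementById("someId")
// to $("someId") -- "document.getElementById" is 23 characters,
// "$" is 1, so that's 22 characters saved per call.
```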

When it means the difference between 2KB and 10KB, even over a bad POTS connection it's not worth sweating. But when the same ratio takes you from 30KB to 150KB, you start running into usability issues over dialup and on slow, crowded networks. Sure, the file is cached, but that's on a per-database basis (and what if the user is using several applications of the same or similar design?) and only after that first slow download. The fact that the application is better-stronger-faster after it loads means very little if the initial hit feels like one is installing Office. Users will not put up with slow, and it will be Domino's fault no matter who actually wrote the code. Chuck one application, and maybe even the platform it runs on. That means that if clarity and extensibility are among the chief aims of the project, as I believe they should be, then we have to aim lower than Google Maps or BaseCamp with our web examples.

I agree that the templates in the set need to be as simple and clear as they can be. We are, after all, trying to sell the possibilities of the platform, and if a developer can't figure out what's going on, she's going to have a hell of a time trying to build her own custom applications. And they should be no simpler than necessary. Each application needs to do something that will be useful to someone, if for no other reason than to encourage the "Notes guy" to open the thing up in Designer to take a look under the hood. There almost needs to be a hole in every application, though; something that is obviously missing, but not so obvious that the template will be discounted as useless right away. NiftyNext™ should be at least as much about teaching as it is about value added to the platform. Let the all-singing, all-dancing, impenetrably complex applications remain as they are now — commercial ventures and the province of internal corporate developers.

After all, if everything comes in the box, we're all out of work.

Friday, August 11, 2006

From the Mail & Guardian online:

When Australian cricket commentator Dean Jones was fired for calling Hashim Amla a "terrorist", the manne were delighted. After all, if everyone went around indulging provocative and childish stereotypes, the Oom might be tempted to call Jones a livestock-romancing wife-beating string-vest-wearing racist bigot Australian yahoo from the arse end of nowhere whose gigantic mouth is writing cheques his tiny brain can’t cash.

"Dinkum moron" by Krisjan Lemmer, 11 August, 2006

Tuesday, August 01, 2006


First off, I'd like to apologize for still not having gotten back to everyone who has written me. The flood of email has been a bit overwhelming, and I've only recently had the wherewithal to download the messages and compose replies offline (thanks again, Devin).

Among the messages have been proposals for independent contract work, possible positions in and around Toronto, and positions elsewhere. There has been an awful lot to consider, and not just in the philosophical, "you've given me a lot to think about" sense.

A big part of what I had to consider was the simple mechanics of getting back to work. As I mentioned before, my life had taken a bit of a downhill slide, and there is a point below which it becomes exceedingly difficult, if not quite impossible, to recover gracefully. I live in a dark, dank and moldy 90-square-foot basement room in a building (and neighborhood) populated primarily by the drug-addled and the insane. People who are self-medicating in extreme excess, and people who are failing to medicate adequately. The noise, the fights and the screaming can get to be a bit much. Working at home is difficult, but alternating doses of headphones and earplugs make it possible, and the occasional escape to the local intarweb cafeteria is welcome respite. Living a normal scheduled life, though, is pretty much out of the question. I sleep when I can, but I haven't had a stretch of time that would have allowed a full night's sleep in some time, and even then the time wasn't at night. So getting up at a normal time and reporting to work during normal office hours would be hit-and-miss at best. I haven't been particularly successful trying lately.

The fix is a simple one. I just need to move. But moving is expensive and disruptive, no matter how one looks at it. Staying in Toronto puts me in a chicken and egg situation -- I'd need to make a considerable amount of money relatively quickly in order to finance a move that would make a regular job possible, but until I move I won't be able to keep the regular hours that would let me keep a regular job.

My current circumstances, then, are not exactly conducive to a conventional approach. So I've decided on the nuclear option. Killing all of the birds in the vicinity with a single, powerful stone. Relocation. A fresh start in a new environment. New country, new surroundings, new type of work, the whole nine yards.

And so I follow friend Nathan to South Africa. Not to Joburg, though; I'll be heading for Cape Town. And not to the same kind of job, nor to the same sort of pay scale. Heck -- I ain't Nathan, and neither are most of you. But I couldn't ask for a better situation, really -- one foot firmly in the realm of the uber code monkey, and the other in the realm of education. That is, assuming I can manage to complete the seemingly trivial tasks of getting a passport and visa done without learning I'm PNG (persona non grata) in a country I'm pretty sure I haven't visited before. Oh, and I have to hope that my criminal background check doesn't reveal any new and hitherto unknown details from my blackout days. (The hardest part of a return to consciousness was always hearing about my escapades for the first time. I'm pretty sure I know everything I should know now, but you never know, ya know?)

If all goes according to plan, I'll fill in the rest of the blanks for you soon. I'd still be doing the Notes and Domino thing, but I'd be spending a lot more time doing the things that I do best. And that's already coming too close to saying too much for now.

Saturday, July 01, 2006


First of all, I want to thank you all from the bottom of my heart for all of the support and encouragement I've received over the past few days. Frankly, I was at the end of my rope, and was absolutely sure that my programming career had come to an end. My previous posting wasn't meant to beg; it was just my best explanation of my absence. Thanks to all of you, though, things are not just looking up, but moving up as well. I will be replying personally to all of the email you've sent, but there's enough of it that it may take me a little while to get around to you. Oh, and enough with the donations already -- we're not quite at the dying-wish-scam level of monies or anything, but you have given me enough to take advantage of the other opportunities I've been presented, and if I manage to blow it from this point onward, then I would fully deserve to crash and burn.

(For those who are interested, and in the interest of fairness and transparency, the total was just under twelve hundred dollars Canadian, which is just about the total sum I had to work with over the previous ten weeks. It means I don't have to be particularly careful using the intarweb today. Since I have real prospects in the offing, I'd prefer to be off the community dole. I'm not going to turn down token recompense for actual help rendered, which is why the button was there in the first place, but I don't particularly like being a charity case. Them what's gone a bit overboard should expect return when I can afford it.)

But Brian Benz was right. This was a hard-learned and hard-earned lesson. There are a lot of things I could have done a lot differently along the way, and I'm pretty darned sure that I'll be doing a little more effective networking and leveraging from now on.

I've been coasting through life, depending on technical talent to get me where I wanted to go. Not that I haven't put any effort into things -- I have spent at least as much time as anyone else learning and practicing my craft (whatever that craft may have been at any given point in my life). That is as true in the Notes world as it was in my avionics life (something that disappeared when discrete electronics gave way to reliable LSIs) or even when I mastered the art of bringing funky old footwear back to life. But as Brian pointed out, mere technical competence is not enough for any of us anymore.

Even those of us who are "mere" employees can't afford to have an employee mentality these days. I come from a long line of labourers and tradesmen who have pretty much had a job for life as long as they did the work and didn't make too much noise along the way. I had pretty much the same thing with my first couple of jobs (including the military), and although I've watched the world around me change, I failed to change with it. I've always believed that there was work out there, and that all one could do, really, was to apply for jobs, shopping oneself out as a commodity labourer. But showing up at the factory gates with work boots and hardhat in hand, hoping that you were close enough to the front of the line to get one of the jobs on offer that day ain't cutting it anymore. That's more or less what I was doing, and it shouldn't have surprised me to find that everything was going to young 'uns with crisp new diplomas and little or no experience -- in the commodity world, price point is king. (And yes, I was testing the waters on the Dark Side as well — I can speak Dotnetese, even if my VB does come out with LotusScript accent and my C# sounds a lot like Java.) I simply didn't have the financial resources to last long enough to make that approach work, and that approach takes time. More time than any of us have.

I haven't had a chance yet to explore all of the "Stan needs help" traffic out there. I was pointed to Volker's posting by Wild Bill and Ben Dubuc in emails, and found Brian's ruminations from the comments there. I only had a half-hour to look at the mail, this blog, check the ol' PayPal account (there was an implied "don't bother" in my previous posting that I am now immensely happy several of you chose to ignore or failed to infer) and so forth -- I was able to scrape up a buck for that, but that didn't leave a whole lot of time to do anything else, really. I have seen enough, though, and in surprising enough places, to finally realise (I hope) that I am not, and cannot treat myself as, a commodity resource. Even if my name wasn't the proverbial household word, I really had no reason to believe that being just one more CV in the pile on some HR desk was going to get me anywhere.

You know, for a guy who spends so much time showing off his m4d 5|<i77z and hanging about in spaces with the elite in the game, I'm pretty clueless in real life.

The only thing that kept the week from being the best of my life, really, was that I was completely in the dark about the Toronto geek dinner with Ed Brill. I didn't have a chance to see Ed's posting on his own site, and the half-hour I was able to manage on the net on Wednesday was timed perfectly to end minutes before a personal invitation/exhortation hit my inbox. The word "dammit" comes to mind. Well, actually, "dammit" is what comes to finger — the word or words that actually come to mind are somewhat less socially acceptable. I believe that makes three times now that I've managed to avoid meeting Ed in person in my own back yard. That's definitely enough of that.

Again, thank you all. As much as the technical sharing in this community has always amazed me, this episode has me utterly gobsmacked.

Monday, June 26, 2006

Thanks for all the fish....

Some blogs die of neglect. Some are withdrawn by people who have second thoughts about being public. This one is dying of poverty and malnutrition. I can't afford to feed it any more.

This whole Notes developer thing has been a blast. The past tense really seems appropriate now, though. I don't have a computer to use since my ancient laptop's power supply took up smoking. Things had been going slowly downhill for a while -- the floppy drive was long gone, the trackpad was just a nuisance that would randomly select and cut things rather than merely point, the battery had just enough capacity for a safe shut down in the event of an AC outage -- but I was really hoping to be able to replace the computer before anything failed that would keep me from working altogether. With a working machine and a wireless card, I could always piggyback on someone else's internet service and at least keep up the appearance of having resources available to me. With that gone, though, I might be able to afford to triage my email once every week or two at an internet cafe, and I can only use my pay-as-you-go cell phone when I'm not at home (I basically had a choice between a place I could afford and one that had cell signal). Combined, that makes it awfully damned difficult to get a programming job. I can't even take work-at-home stuff unless there's enough money up front to cover the price of a power supply and comms costs, and I really can't imagine anyone being that stupid.

So that's it, folks. It's been a slice. I have enjoyed working on the Notes and Domino platform. And I have met some pretty fantastic people along the way, at least in the online sense of the word "met".

I don't know if I'll ever be in a position to rejoin the fold. I sold everything I had of value to keep up appearances, hoping that I would land a job before I ran out of resources. Well, that didn't quite work out the way I had hoped. I am literally using my last disposable dollar for the month of June to post this entry. I certainly won't be able to spend time and money composing or responding to blog entries or posting on the LDD forum on next month's budget unless I've found a job in the meantime. And when I do find work, it's not very likely to be Notes-related, or even computer-related, and probably won't be full-time or pay well enough to get me out of the hole I've dug for myself anytime soon.

Feel free to leave comments or send email, but be aware that I won't be able to read either right away. (Same thing goes for the PayPal panhandling button in the navigation area — while I'd be immensely appreciative of any spare change tossed my way, I probably wouldn't be aware of it for some time.) I'll try to keep you up to date, but I'm not sure that it'll be relevant reading for anyone who isn't fascinated by train wrecks. Sorry about that, folks.

Wednesday, April 12, 2006

Back in business

Yes, folks, it has been a little while. A number of things have been happening of late that have sort of kept me in a perpetually flummoxed state of mind, and I haven't really known what to say.

First off, for those who are keen to know, my back problems seem to be behind me now. (That seems to be a pun, but isn't really.) I've lost about fifty pounds along the way (only my socks and shoes still fit from last year's wardrobe), and I'm feeling capable, almost to the point of chipper. There's still a bit of work to go before I can say I've achieved chipperousity, but we're getting there. In the meantime, I'm ready to get back to work.

Therein lies the problem. IT hiring is a Process, and I was really expecting that part of things to go slowly. What I wasn't expecting is that it would be so damned difficult to get an interim job to generate some kind of income. Even after dumbing my resume down so it wouldn't look too scary, I've found that the McJob world isn't too excited about my return to the workforce. Now, I can understand that managers might think a fellow like me might not stick around to make a career of things, but come on — these places depend on staff turnover to avoid paying benefits. That's what the McJob is all about.

Oh, well — at least I'm getting some nibbles on the developer contract front. With any luck at all, one of them will pan out soon.

Thursday, February 23, 2006

About this SnTT thing....

From a message I found sitting on my desktop last night:

Thomas: Funny thing at work today... a coworker was looking for an answer to something, and her summary statement was "And Stan Rogers comes through for me again!"...

While I heartily applaud this community effort, I don't think this is one of the places you should be looking for a lot of contributions. Folks, it's not that I don't want to share what I've learned with the community at large, but that Thursday is just another day of the week, you know? For the past couple of weeks I've felt kind of guilty about staying out of the Thursday thing while I try to rescue a career from the wreckage of my life -- between the book, the suite and trying to launch a business, I really haven't had the time to launch into an article-length presentation or to try to think about a problem that hasn't already been solved in a hundred ways before.

Then I got the message you see above from our Duffbert. And I ran a quick agent that gave me this:

And suddenly, I don't feel quite so guilty anymore.

Friday, February 03, 2006

Want it in white?

Yellow may be the new black, but there are clean ways of getting that across, too.

Again, node editing in PSP was used here on shapes derived from the text. Specifically, the bottom of the "e" in the word "Yellow" was opened up so that the black outline wouldn't make it look like a Greek theta (θ). I really have to find a better font than Arial, though — it's a bit heavy-handed, even if one makes allowances for the small size of these graphics.

Tuesday, January 31, 2006

Email signature graphic, anyone?

I left a little room on the right for your own blurb -- or you can just chop it off. Your call.

Added: Oh, by the way -- if any of you are using PSP and want to modify the graphic to suit your own taste, there's no need to re-invent the wheel. Just mail me to request the multi-layer PSP-native file. The base graphic is much bigger than what you see here (3200 by 250), the text is still vector, and so forth. If you want to resize and move things around, modify the background colour, etc., you'll get a better final product than if you start with a small GIF. And big thanks to Ben Poole for supplying me with the Mac icon resource.

I'm still looking for a larger version of the current Lotus Software logo -- all I've been able to find is a small GIF. If anyone's got a large TIFF or EPS version (the sort of thing that's usually supplied for print ads), I'd be more than happy to have it. I've got some very good ideas for future versions, but I can't use the logo I have now (the biggest I could find on a web search), and trying to recreate something that's trademarked without the original font is, um, a wee bit difficult.

Added more: Thanks to Bernd Hort, I now have the EPS logo I wanted for my birthday. Now to see if I can make it worth the trouble.

Monday, January 30, 2006

Yellow is the new black

Might as well join the party, what? Some nearly purty pitchers below:

Just a stripe
YITNB Wallpaper -- 1600 by 1200
YITNB Wallpaper -- 1280 by 1024
YITNB Wallpaper -- 1024 by 768 -- GIF

For some truly maddening reason, Blogger has decided that these images should be JPEGs rather than the original GIFs except for the 1024 by 768 version. The conversion must happen based on the image size. I've mailed the originals off to Rocky and Rich, so if you want the version without JPEG artifacts, you may be able to get them somewhere else. If not, you can always mail me. Didn't hit your screen size? Want a custom insert somewhere? Same deal -- mail me.

Saturday, January 28, 2006

About Lotusphere 2006

Okay, so I wasn't there. In some respects, I'm thankful for that — I had the predicted back mutiny. But this was, apparently, not the Lotusphere to miss.

There is a level of excitement in the community that I have never seen before. Rocky blogged about the enthusiasm cycle, and I have to say that as somebody who entered the Notes and Domino world on or about the nadir (late 2000), I am glad to see both that IBM has made their commitment to the platform clear, and that the community has collectively stopped wondering when the other shoe is going to drop. It's not one or two voices crying in the wilderness anymore.

I don't know about the rest of you, but I can't wait for Hannover to go to public beta. It's hard to tell from a coupla screen shots and some third-hand reporting, but it looks like I'll finally be able to do in the Notes client a lot of what I've been working so hard to do on the web. Rich is not the only one turned on by activity-centric collaboration. That has been both the single biggest hurdle in getting my Domino application suite together, and the single biggest reason why it cannot be a Notes client application suite right now, today. Hannover and WCS look like they will provide a solution to a problem that is currently intractable in Notes.

I am sorry I missed some of the sessions. Jess's "admin for dummies" session for developers got rave reviews, and Julian has been declared the new star presenter. Okay, I'm not going to list everything I missed -- you can look up the whole session list more easily than I can type it.

I just hope that, in a couple of thousand years' time, my descendants, should there be any, aren't gathered around a table sometime in January for a festival dinner that includes a wistful "next year in Orlando".

Friday, January 06, 2006

Visiting hours

Howzabout a couple of you Notes-type people dropping by Esther's place for a visit? Seems there's a reason she went silent back there. Bring some virtual flowers, and see if you can sneak in some e-candy.

Monday, January 02, 2006

Better late than never, eh?

Well, here we are at the start of another year. Or, rather, here I am. I assume that most of the rest of you were more-or-less functional over the traditional days of reflection (those being December 31 and January 1). I was somewhat less than my usual self. Or at least I hope I was. If this is the new "usual", I'd just as soon one of you had the decency to have me put down.

All that aside, though, 2005 was not such a bad year. Sure, I spent most of it in a "foreign" city that I never quite got around to seeing, doing work that I found rather less than fulfilling, and topping it all off with a mental and physical breakdown. But in a real sense, that was the good part. It forced me to take a long, hard look at where I was and what I was doing, and when I did that, it was blindingly obvious that I was headed in the wrong direction.

2006 will be a little different. I have to take it easy for physical reasons, so I will have a bit of time away from the keyboard. I just might spend some of that free time interacting with humans for a change. That's something I've done far too little of for the last little while. And there's something to be said for the occasional moment of peaceful, quiet solitude. Not loneliness, Lord knows I've had more than enough of that to last me a long time, but time alone to reflect and digest. Maybe throw the occasional dab of paint onto a canvas*.

On the technical front, I've still got a couple of things to tidy up. One of the nice things about new, independently produced software is that I can take the time to do things right. I'm not facing a promised delivery date or a bunch of investors who are wondering where their profits are, so I don't have to leave any code or architecture behind that is merely good enough for now. I hope that when it is released, those of you who are in a position to know the difference can tell that I've had the chance to refine my approaches.

Is the old dog going to learn any new tricks this year? Only if I have to. I don't know about you folks, but I've put a lot of time in over the years learning new technologies, techniques and environments. It's all been book learnin' and putterin', though. My day job has always been the same old same old -- formulas, LotusScript, 1.1.x-vintage Java when absolutely necessary (since one can't always count on the next Notes guy really knowing Java, we've had to limit it to things that really needed to use something like the package). It's time to forget about learning new stuff and start using some of what I've already learned.

I'm ready to face the new year and the lessons and rewards it has to offer. And I wish you all a happy and rewarding 2006.

*I started a number of paintings last year, but they all became what we artists call "unfinished works". There is a limit to how long a painting can be left unworked before new paint stops adhering properly to old, and I'd hate to put anything out there, even as a gift, that's going to have to spend more time at the conservator's than on the wall.

Physically Impossible

If Stan loses 5 kilograms of mass over the course of 31 days and E = mc², then what happened to the 168 GW I've been providing? Should I be getting a rebate from Hydro?
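For the curious, the figure checks out: a quick back-of-envelope sketch (mine, not Stan's) converting 5 kg straight to energy via E = mc² and averaging it over 31 days does come out to roughly 168 GW of continuous power. (GW is already a rate, so no "per day" needed.)

```python
# Back-of-envelope check of the E = mc^2 joke:
# convert 5 kg of "lost" mass to energy, then average over 31 days.
m = 5.0                    # mass lost, kg
c = 299_792_458.0          # speed of light, m/s
days = 31

energy_j = m * c * c               # total energy in joules (~4.5e17 J)
seconds = days * 86_400            # seconds in 31 days
avg_power_w = energy_j / seconds   # average power in watts

print(f"{avg_power_w / 1e9:.0f} GW")  # prints "168 GW"
```

Sadly for the Hydro rebate, diets shed mass as ordinary matter (mostly exhaled CO2 and water), not as radiated energy.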