
Dominic Cronin's weblog

Breathing new life into the Tridion power tools

Posted by Dominic Cronin at Oct 12, 2011 10:25 PM

A week or so ago we had the second Tridion MVP retreat. SDL's Tridion MVP programme is intended to encourage and inspire community-building among people who use the Tridion range of products. People who contribute visibly to the community are recognised with the Most Valued Professional award. As one of this year's recipients, I was invited to spend a few days as Tridion's guest at a "retreat" in Portugal, also attended by the Community Builders (which is how Tridion extends the same recognition to its own people who are active in the community).

It was an intense few days. The MVPs and Community Builders are social, driven and experts in their field; you couldn't wish for a more stimulating group of people to bounce ideas off, and the discussions ranged far and wide. These are real in-the-trenches practitioners, and it was great to hear the different ways people approach similar problems. Most of the attendees gave technical presentations, all of which were interesting both in themselves and in the discussions they generated. The highlight, though, was our "technical assignment". The Power Tools will be familiar to most Tridion specialists as a set of custom pages which use the API to make some common tasks more manageable. Now that GUI extensibility is a first-class citizen, it makes sense to review the old power tools and re-implement them in the new idiom. Chris Summers had prepared the ground before the retreat, and once he had led us through his analysis, the team began work on a re-implementation of the batch image uploader.

The new power tools are now hosted on Google Code, and in addition to beginning the work of revitalisation, the MVPs and Community Builders also spent time setting up the basis of an open-source development team to continue the work once we all got back home.


I'd like to thank Tridion** for inviting me: not only for their splendid hospitality, but also for the opportunity to spend some time off-line working with such stimulating people. As always - thanks to Nuno for pulling it all together, and thanks to the guys for being who they are!

 

** Yes - I know that I can't thank "Tridion", and that I should say "SDL Web Content Management Solutions", but everyone knows what I mean.

Announcing the Tridion Practice project

Posted by Dominic Cronin at Jul 23, 2011 05:30 PM

There's much talk in some circles about the "Tridion community" and how we can promote community contributions and collaboration. I've been thinking about this, and I see a "gap in the market". Tridion's own building blocks exchange is great, and there are some outstanding community contributions on there. It's a showcase for people's coding talents and much more, but when it comes to making use of the building blocks, the most obvious way to do so is to download and install a given building block. Somehow the focus is not on showing people how they can raise their game at a coding level. The code is available, but it seems the focus is not on the code.

Other people have mentioned the need for a "cookbook" with examples and samples, but perhaps with the idea that this is something Tridion should produce. Of course, the documentation team at Tridion are putting huge efforts into expanding the product documentation, and there are many examples to be seen there. That's not what I'm on about either.

You don't read product documentation to become inspired. (My humble apologies to the documentation team for my boorish bluntness.) But really - you don't. That's not what it's for. Then of course, there are other coders' blogs. Some of the Tridion bloggers are great, and you do sometimes get the sense of "Hey guys! This is cooool!! Did you know you could do this!!!! ???". That's something we need more of.

So you're probably ahead of me here, but the "gap" I talked of - what's missing - is a community collaboration cookbook. I'm announcing today that I have begun that project, and that anyone who wishes to help is welcome to join me. I believe that by collaborating, we can produce an on-line resource that combines real inspiration with sheer grinding competence. I've chosen to host it on "neutral" territory, at Google Code (https://code.google.com/p/tridion-practice/). I've also selected a liberal open source license (MIT) for the project, because I don't want anyone to have a moment's doubt as to whether it's OK to copy and paste the code straight into their own work.

I've begun the cookbook with a single recipe: a template building block that factors some of the work out of building page templates. I hope this inspires you (OK - I'm smiling as I write this!) but if it doesn't please don't let that put you off. Maybe it's just not your thing, or you have a completely different approach to the same problem that works fine for you. In that case, maybe one of the other recipes will float your boat (assuming that more recipes will come).

Alternatively, you may look at it and think: "Gee, that would be great, except that such a thing irritates me, or I don't see why that part of the explanation is like that, or sheesh that code is ugly, or why aren't there any comments", or whatever. I know you're definitely ahead of me this time, but I'm going to say it anyway; this is where you come in. Community collaboration doesn't have to be about spending all your free evenings for a week producing a whole recipe yourself. You can help to improve it simply by sending in a couple of review comments, or whatever contribution suits you. Part of the reason for hosting it on neutral territory is to reinforce the idea that criticism is welcome.

Then again, you may not wish to contribute directly. That's also fine. Please make use of whatever you find on the Tridion Practice project. Enjoy! Be inspired! :-)

How does SDL Tridion rate on J. Boye's 8 CMS features you are likely never to use?

Posted by Dominic Cronin at Mar 21, 2011 10:25 PM

CMS industry analyst Janus Boye just posted "The 8 CMS features you are likely never to use". Generally, I agree with Janus's 8 points, and I definitely agree with the thinking behind them: that it's important to know which features are relevant to you when choosing a CMS. Here's my point-by-point take on how this applies to Tridion:

  • Workflow is something most customers ask for, but the vast majority don’t use it. Some use very simple approval processes, but that’s quite far from some of the wizz-bang visual workflow creation tools that I’ve seen in sales meetings.
  • Dominic: Indeed, the vast majority don't use it, and I'd be the first to advise them not to. Unless, of course, they need it. Working with workflow is always going to be more complex than working without it, but if you have a governance requirement which demands it, you have to have it. Tridion can meet this need, but I'm very glad most implementations don't need it.

  • Color coding changes; so that you can easily see what changed between versions (similar to Microsoft Word change tracking) is also a nice demo feature which may be used to differentiate a vendor, but in reality customers rarely use it.
  • Dominic: This feature is available in Tridion, and to be honest, I don't know how much it gets used by content workers. For developers, I'd say it gets used pretty regularly.

  • Microsoft Office integration; e.g. getting content from Word into CMS by directly clicking save from Word. SharePoint is the main exception to this, where several SharePoint customers actually use this feature, in particular for their intranet. Many press releases and other web content are born in Word, so this demo normally get the editors excited, but in reality Word integration often ends up as a copy and paste job.
  • Dominic: At one time or another, Tridion has had built-in integrations with Word. Over the years, people have realised that copy and paste works just as well, and they understand it better. Most implementations have some code in the format area XSLT for cleaning up the MSO "crap" that comes along with the paste. I don't know of anyone these days using an explicit integration, (or even if it's still on the truck at Tridion).

  • Future preview, e.g. how will my site look in a week for scheduling campaigns. A nice idea, in particular for digital marketing to be able to see how the site will look at the launch of the next campaign, but not exactly straight-forward to implement.
  • Dominic: You could implement this easily enough in Tridion with maybe some BluePrinting and an extra publication target. On the other hand, I have never, ever, been asked for this.

  • Back-end analytics; e.g. CMS statistics on usage of the administrative / editorial interface. Most web professionals struggle to find time to look at their website analytics, so while this can also be a persuasive sales demo, very few find the time to actually look at the CMS statistics.
  • Dominic: Totally agree. What? Like the content teams have got too much time on their hands. Gee, we musta been good! :-)

  • Advanced search; very few people use Google Advanced Search, and even fewer use and rely on the advanced search provided by the CMS. Some vendors have implemented advanced search features, often via integration with a 3rd party search engine. Once and again; this can be a great demo in particular for those organizations that feel they are drowning in content, but advanced search is simply not used.
  • Dominic: Down the years, Tridion has been a "best of breed" WCMS. The vision was always to have great APIs, so that Tridion itself could integrate well with other specialised software. (In other words: have some focus; stick to what you're good at.) It's not uncommon to feed data to a search engine at publish time, although just having your search appliance spider the site is probably just as good. So advanced search is often a requirement on Tridion projects, but you don't usually use Tridion itself for it. Of course, these days, we have advanced taxonomy support, and some search-like features might well be driven by categorising content on the back end.

  • A/B testing e.g. for headlines on-the-fly to find the headline that performs best. A/B testing is beginning to become more popular, but is mostly implemented outside the CMS by a 3rd party tool.
  • Dominic: Indeed - more usually done with a third-party tool. It's not core WCMS functionality. You can definitely integrate it via some of Tridion's out of the box features (e.g. target groups, customer characteristics) but it's not what a WCMS is for.

  • Frontpage editing, e.g. click directly on edit on a given page; another great demo which scores cheap usability points. Rarely implemented in the real world, except for small customers that don’t worry too much about staging environments, permissions and quality assurance processes.
  • Dominic: In Tridion, this is called SiteEdit, and these days it's pretty much an expected part of an implementation, perhaps even more so with big customers. I think Janus is missing the point a little here. You need a staging environment for this approach to make sense, because that's where you use it. You certainly don't miss out permissions and quality assurance. With Tridion SiteEdit, you still need the requisite permissions, and your quality assurance process stays intact.

     

    Keep the good stuff coming Janus. Maybe some other people will give a detailed breakdown for their favourite CMS.

    Republish from publish queue for Tridion 2011

    Posted by Dominic Cronin at Mar 21, 2011 07:20 PM

    Many of you will remember that absolutely the most popular Tridion extension ever :-) was my Republish from publish queue extension. It has just come to my attention that Bart Koopman has implemented pretty much the same thing for Tridion 2011.

    First of all - it's great to see this old chestnut get a new lease of life. Thanks Bart.

    Just in case anyone thinks Bart nicked my idea - well he didn't. We were both inspired by Hendrik Tredoux's idea which he posted on the ideas site way back. (Actually the "most voted" idea on the site to this day.) The irony of it is that although Hendrik was the one who posted the idea - he could probably have implemented it in his sleep, and the people of whom that is true would make a very short list indeed. Hopefully, with the great extensibility API in 2011, the list will be much longer.

    But back to the main theme. That's the great thing with a software community; we're all throwing ideas around and bouncing off each other. I love it. The really good part is that right now I'm just a tad too busy to re-implement this extension for 2011 (it was on my to-do list - honest!) and now I don't have to. I suppose I should at least find the time to download Bart's extension and kick the tyres. :-)

    Why Scrum when you can Maul?

    Posted by Dominic Cronin at Mar 20, 2011 12:25 PM

    I'm currently doing a Scrum project, and I suppose it's not surprising that we occasionally make reference to scrums in the sport of Rugby football. According to Wikipedia,  the use of the term in the context of development projects seems to have come from a 1986 article by Takeuchi and Nonaka describing a new approach to manufacturing cars, photocopiers, cameras and the like. Their analogy compares a relay race to a game of rugby, and although the term Scrum is used, it's really incidental to their point, which is that a "holistic" team-based approach can be more useful than the sequential approach which was widely in use.

    If you really want to know how Scrum came to mean what it means today in software development, you might follow some of the other links in the history section of the Wikipedia article. Suffice it to say that the use of the word Scrum in this context probably just refers to the concerted approach of a Rugby team rather than to the specifics of what a scrum is in Rugby.

    What most Rugby players today would call a scrum is something of a set piece which is used to re-start play after an infringement. The forwards of each team form a well-defined structure, putting their arms around each other and holding on to each other's clothing (binding-on). Three distinct rows of players on each side face their opponents and "engage" with them in a carefully controlled ritual.

    When you're creating software in a Scrum team, the idea is that each team-member should be able to dynamically shift their focus as necessary to make sure they are contributing to the most important goal of the team at any given moment. Of course, a Rugby scrum is a great analogy for general team-work and having a concerted goal, but it doesn't really say much about the need for team members to spot when a colleague needs help and come to their assistance. This is far better illustrated by another feature of Rugby - the Maul.

    A maul in Rugby takes place when a player wants to move forwards with the ball and is stopped by opposing players. From going forwards, he turns back towards his own supporting players, who bind on and drive the maul forwards. More players can run in from behind the maul, bind on and add their strength to overcoming the obstacle. All this happens without a whistle being blown or play being stopped. A team with the right training and discipline can generate and control a so-called "rolling maul", where a stable formation is maintained while driving forward and gaining sometimes significant territory.

    You can see how this works in this training video, and if you're interested a quick search of YouTube would yield many examples of teams putting the technique to use with great effect.

    I'm quite sure that Scrum as a brand is very well established, and I'm not proposing that anyone should start up a new software development method called Maul (although if you do, I want the credit!). Even so, the game of Rugby has more metaphors to offer than the simple one of team effort in a common cause. The scrum itself illustrates a formally organised set-piece, with the players fulfilling their specialist roles. In free-flowing play, such as mauling, the team expresses its ability to self-organise and respond to unforeseen conditions. You'll also see the benefits of cross-disciplinary working. When things turn chaotic, the relevant specialist probably isn't where he's most needed, so the players on the spot do whatever's needed. As that happens, the rest of the team are dynamically reorganising so that when the ball comes out, it will be into the hands of the right player.

    Of course, you can take this too far. Forthcoming posts: Should the Scrum-master role be re-branded as scrum-half? OK - never mind.

    Site now running on plone 4, and without disgusting blue-green skin

    Posted by Dominic Cronin at Feb 22, 2011 12:05 AM

    Erm.. yes - I did know that the site previously looked awful. I'm not a web designer, and I never made any pretence of being one. Suffice it to say that I've upgraded to Plone 4, and in doing so, I'd have had to make a conscious choice to bring the old skin with me. As my primary goal in messing with the site in the first place was to fix that ugliness, I've decided that - having got things working - the default plone look and feel is enough for now.

     

    Over the coming period, I intend to find some time to spend on re-skinning the site properly - in the meantime, at least it's readable.

     

    Hmm - wonder what it should look like!!?

    Using powershell to do useful things with XML lists from Tridion

    Posted by Dominic Cronin at Dec 30, 2010 09:55 PM

    For a while now I've been trying to persuade anyone that would listen that Windows Powershell is great for interacting with the Tridion object model (TOM). What I mean by this is that you can easily use the powershell to instantiate and use COM objects, and more specifically, TOM objects. In this post, I'm going to take it a bit further, and show how you can use the powershell's XML processing facilities to easily process the lists that are available from the TOM as XML Documents. The example I'm going to use is a script to forcibly finish all the workflow process instances in a Tridion CMS. (This is quite useful if you are doing workflow development, as you can't upload a new version of a workflow while there are process instances that still need to be finished.)

    Although useful, the example itself isn't so important. I'm simply using it to demonstrate how to process lists. Tridion offers several API calls that will return a list, and in general, the XML looks very similar. I'm going to aim to finish all my process instances as a "one-liner", although I'm immediately going to cheat by setting up the necessary utility objects as shell variables:

    > $tdse = new-object -com TDS.TDSE
    > $wfe = $tdse.GetWFE()

    As you can see, I'm using the new-object cmdlet to get a TDSE object, specifying that it is a COM object (by default new-object assumes you want a .NET object). Then I'm using $tdse to get the WFE object which offers methods that list workflow items. With these two variables in place, I can attempt my one liner. Here goes:

    > ([xml]$wfe.GetListProcessInstances()).ListWFProcessInstances.Item | % {$tdse.GetObject($_.ID,2)} | % {$_.FinishProcess()}

    Well, suffice it to say that this works, and once you've run it (assuming you are an admin), you won't have any process instances, but perhaps we need to break it down a bit....

    If you start off with just $wfe.GetListProcessInstances(), the powershell will invoke the method for you, and return the XML as a string, which is what GetListProcessInstances returns. Just like this:

    > $wfe.GetListProcessInstances()
    <?xml version="1.0"?>
    <tcm:ListWFProcessInstances xmlns:tcm="http://www.tridion.com/ContentManager/5.0" xmlns:xlink="http://www.w3.org/1999/xlink">
      <tcm:Item ID="tcm:24-269-131076" PublicationTitle="300 Global Content (NL)" TCMItem="tcm:24-363" Title="Test 1" TCMItemType="16" ProcessDefinitionTitle="Application Content Approval" ApprovalStatus="Unapproved" ActivityDefinitionType="1" WorkItem="tcm:24-537-131200" CreationDate="2010-12-30T19:35:33" State="Started" Icon="T16L1P0" Allow="41955328" Deny="16777216"/>
      <tcm:Item ID="tcm:24-270-131076" PublicationTitle="300 Global Content (NL)" TCMItem="tcm:24-570" Title="Test 2" TCMItemType="16" ProcessDefinitionTitle="Application Content Approval" ApprovalStatus="Unapproved" ActivityDefinitionType="1" WorkItem="tcm:24-538-131200" CreationDate="2010-12-30T19:36:04" State="Started" Icon="T16L1P0" Allow="41955328" Deny="16777216"/>
    </tcm:ListWFProcessInstances>
    

    OK - that's great - if you dig into it, you'll see that there is a containing element called ListWFProcessInstances, and that what it contains are some Item elements. All of this is in the tcm namespace, and each Item has various attributes. Unfortunately, the XML in this form is ugly and not particularly useful. Fortunately, the powershell has some built-in features that help quite a lot with this. The first is that if you use the [xml] cast operator, the string is transformed into a System.Xml.XmlDocument. To test this, just assign the result of the cast to a variable and use the get-member cmdlet to display its type and methods:

    > $xml = [xml]$wfe.GetListProcessInstances()
    > $xml | gm
    

    (Of course, you don't type "get-member". "gm" is sufficient - most standard powershell cmdlets have consistent and memorable short aliases.)

    I won't show the output here, as it fills the screen, but at the top, the type is listed, and then you see the API of System.Xml.XmlDocument. (Actually you don't need a variable here, but it's nice to have a go and use some of the API methods.)

    All this would be pretty useful even if it stopped there, but it gets better. Because the powershell is intended as a scripting environment, the creators have wrapped an extra layer of goodness around XmlDocument. The assumption is that you probably want to extract some values without having to write XPaths, instantiate Node objects and all that other nonsense, so they let you access Elements and Attributes by magicking up collections of properties. Using the example above, I can simply type the names of the Elements and Attributes I want in a "dot-chain". For example:

    > ([xml]$wfe.GetListProcessInstances()).ListWFProcessInstances.Item[0].ID
    tcm:24-269-131076

    Here you can also see that I'm referencing the first Item element in the collection and getting its ID attribute. The tcm ID is returned. All this is great for exploring the data interactively, but be warned, there is a fly in the ointment. Behind the scenes, the collections the powershell creates already have a member of their own called Item. This means that whereas you ought to be able to type

    ([xml]$wfe.GetListProcessInstances()).ListWFProcessInstances

    and get some meaningful output, instead, you'll get an error saying:

    format-default : The member "Item" is already present.
        + CategoryInfo          : NotSpecified: (:) [format-default], ExtendedTypeSystemException
        + FullyQualifiedErrorId : AlreadyPresentPSMemberInfoInternalCollectionAdd,Microsoft.PowerShell.Commands.FormatDefaultCommand

    This is because Tridion's list XML uses "Item" for the element name, and it conflicts with powershell's own use of the name. It's an ugly bug in powershell, but fortunately it doesn't affect us much. Instead of saying "ListWFProcessInstances", just keep on typing and say "ListWFProcessInstances.Item" and you are back in the land of sanity.

    Apart from this small annoyance, the powershell offers superb discoverability, so for example, it will give you tab completion so that you don't even have to know the name of ListWFProcessInstances. If at any point you are in doubt as to what to type next, just stop where you are and pipe the result into get-member - all will be revealed.
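
    For instance (just a sketch - the exact members you see will depend on your own data), you could point get-member at the Item collection itself:

    > ([xml]$wfe.GetListProcessInstances()).ListWFProcessInstances.Item | gm

    Alongside the standard System.Xml.XmlElement members, the attributes of each Item show up as properties, which is exactly what makes the "dot-chain" style possible.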

    OK - back to the main plot. If you're with me this far, you have probably realised that

    ([xml]$wfe.GetListProcessInstances()).ListWFProcessInstances.Item

    will get you a collection of objects representing the Item elements in the XML. As you probably know, an important feature of powershell is that you can pipeline collections of objects, and that there is syntax built in for processing them. The % character is used as shorthand for foreach, and within the foreach block (delimited by braces), the symbol $_ represents the current item in the iteration. For example, we could write:

    > ([xml]$wfe.GetListProcessInstances()).ListWFProcessInstances.Item | % {$_.ID}
    

    and get the output:

    tcm:24-269-131076
    tcm:24-270-131076
    

    I'm sure you can see where this is going. We need to transform the collection of XML elements (or rather their ID attributes, which identify the process instances) into a collection of TOM objects, so with a small alteration in the body of the foreach block, we have

    % {$tdse.GetObject($_.ID,2)}

    and then we can pipe the resulting collection of TOM objects into a foreach block which invokes the FinishProcess() method:

     

    % {$_.FinishProcess()}

    Of course, if you like really terse one-liners, you could amalgamate the last two pipeline elements so that instead of:

    > ([xml]$wfe.GetListProcessInstances()).ListWFProcessInstances.Item | % {$tdse.GetObject($_.ID,2)} | % {$_.FinishProcess()}

    we get:

    > ([xml]$wfe.GetListProcessInstances()).ListWFProcessInstances.Item | % {$tdse.GetObject($_.ID,2).FinishProcess()} 
    

    but in practice, you develop these one-liners by exploration, and if you want something terse that you can fire off whenever you need it, you are more likely to write a long-hand version, put it in your $profile, and give it an alias.
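
    To give an idea of what that long-hand version might look like, here's a sketch for your $profile. The function name and alias are just ones I've made up for illustration, and in real life you'd probably want to add some error handling:

    function FinishAllProcessInstances {
      # Forcibly finish every workflow process instance in the CMS (you'll need admin rights)
      $tdse = new-object -com TDS.TDSE
      $wfe = $tdse.GetWFE()
      ([xml]$wfe.GetListProcessInstances()).ListWFProcessInstances.Item |
        % { $tdse.GetObject($_.ID, 2).FinishProcess() }
    }
    set-alias -name fapi -value FinishAllProcessInstances

    With that in place, typing "fapi" at the prompt does the whole job.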

    As I said at the top - this is just an example. All the TOM functions that return XML lists can be treated in a similar manner. Generally all that changes is the name of the root element of the XML document, and as I have pointed out, this is easily discoverable.

    I hope this approach proves useful to you. If you have any examples of good applications, please let me know in the comments.

    A Happy New Year to you all.

    Dominic

    A new version of the Tridion developers' Powershell profile for SDL Tridion 2011

    Posted by Dominic Cronin at Nov 30, 2010 12:30 PM

    As I'm up at Tridion HQ for a few days for the 2011 boot camp, here's an updated version of the powershell profile for Tridion developers. Nothing really new, just that a couple of things got their names changed. (With the added bonus that the services are now consistently named again!!)

     

    # http://www.leastprivilege.com/MyMonadCommandPrompt.aspx
    function prompt { "PS " + (get-location).Path + "`n> " }
    # http://www.interact-sw.co.uk/iangblog/2007/02/09/pshdetectelevation
    & {
      $wid=[System.Security.Principal.WindowsIdentity]::GetCurrent()
      $prp=new-object System.Security.Principal.WindowsPrincipal($wid)
      $adm=[System.Security.Principal.WindowsBuiltInRole]::Administrator
      $IsAdmin=$prp.IsInRole($adm)
      if ($IsAdmin)
      {
        (get-host).UI.RawUI.Backgroundcolor="DarkRed"
        clear-host
      }
    }
    # http://www.leastprivilege.com/AdminTitleBarForPowerShell.aspx
    $id = [System.Security.Principal.WindowsIdentity]::GetCurrent()
    $p = New-Object System.Security.Principal.WindowsPrincipal($id)
    if ($p.IsInRole([System.Security.Principal.WindowsBuiltInRole]::Administrator))
    {
     $Host.UI.RawUI.WindowTitle = "Administrator: " + $Host.UI.RawUI.WindowTitle
    }
    "rat"
    set-alias -name rat -value RestartAllTridion
    function RestartAllTridion
    {
      "### Restart All Tridion ###"
      "Stopping All Tridion CM services"
      $runningServices = get-service TCMBCCOM, `
                                     TCMIMPEXP, `
                                     TcmPublisher, `
                                     TCMSearchHost, `
                                     TcmSearchIndexer, `
                                     TcmServiceHost, `
                                     TCMWorkflow `
                         | where {$_.Status -eq "Running"}
      $runningServices | % {stop-service -force -InputObject $_}
      kt
      "Doing an IISRESET"
      iisreset
      "Starting Tridion services"
      # This script basically does best-effort, so we need a sick-bag in case a service is disabled or whatever
      # (feel free to wire up the WMI stuff if you need to scratch this)
      &{
        trap [Exception] {}
        $runningServices | where { "Stopped", "StopPending" -contains $_.Status } | start-service
      }
    }
    "kt"
    set-alias -name kt -value killTridion
    function KillTridion {
      "Shutting down Tridion COM+ Application"
      $COMAdminCatalog = new-object -com COMAdmin.COMAdminCatalog
      $COMAdminCatalog.ShutdownApplication("SDL Tridion Content Manager")
    }

    Javascript for Tridion scripting

    Posted by Dominic Cronin at Oct 09, 2010 07:30 PM

    During the recent Tridion MVP summit, the subject of Javascript came up. That's not surprising, as we were busy with building some GUI extensions to run on the SDL Tridion 2011 CTP, which is a pretty Javascript intensive activity. Most Tridion people reading this won't be too surprised if I say that Javascript isn't really in the comfort zone for many of us. We've done templating in VBScript, and although Tridion has always supported Javascript (or more strictly, JScript) for templating, we've always avoided it. Tridion has never shipped a JavaScript version of the default templates, and I suspect that's why, very early on, VBScript became the VHS to JavaScript's BetaMax. Most people accept that Betamax lost to VHS despite being technologically better, and for the sake of this discussion I'm going to ignore the dissenting opinions that have surfaced recently. In my view JavaScript is a superior technology to VBScript in many ways, but VBScript achieved critical mass in our world, and that's that. Or is it?

    Well to start with, I'm not about to suggest that anyone should start to write Tridion component templates in JavaScript. These days, if you're defecting from VBScript templating, you'll be going to a .NET language; probably C#, and I'm all in favour of that. On the other hand, for many of us, JavaScript clearly has a place in our future. If you're writing web applications these days, a solid grasp of js is essential, and as noted already, if you want to customise the Tridion GUI, you'll be elbow-deep in the stuff before you get anything useful done.

    For myself, I started using Javascript for simple update scripts and the like quite some time ago, and this mostly means running the scripts from the command line on the server using the cscript processor built-in to Windows. (These days, I also use PowerShell scripting for some of the more ad hoc work, but that's another story.)

    The first obvious point about the language is that it has several styles or idioms in common use. For those of you that are familiar with libraries such as JQuery, you'll know that the style can be very similar to what you might encounter in languages with a functional flavour. To gain expertise in this style, I'd strongly recommend jumping in the deep end with John Resig's "Learning Advanced Javascript". (It's a pretty deep deep end. Note to self: have another go soon, and try to understand it this time!)

    The other two idioms I'd mention are the object-oriented style, and what I can only describe as Microsoft style. For the object-oriented style, and how to achieve it, you can do no better than to view Doug Crockford's lecture series hosted at Yahoo. I suppose the best place to learn the Microsoft style is MSDN. In the days of the Atlas project, Microsoft tried to use JavaScript as just another layer in the ASP.NET stack, and with some success. For Tridion guys, this style is most notable because some of this flavour is to be seen in the Tridion GUI - or at least that's how it appears to me based on a fairly perfunctory poke around. (I'd love someone to tell me if I'm wrong. Really! Although comments won't show up immediately, as I have to moderate to prevent spam. That's my only moderation criterion, though. If you take the trouble to write a real comment, I'll publish it.)

    I hope to return to the subject of Tridion and Javascript in future posts, but for now, I'd just like to start with a simple example of why it works for me. But please be kind; I'm not a Resig or a Crockford.

    It's fairly often useful to be able to recurse through a folder structure and process each item within a given hierarchy. Usually, the processing involves two parts. Firstly, filtering: am I interested in this item or not? Secondly, for the items I'm interested in, I want to run some code, either to report on the item or to alter it in some way. This pattern comes up a lot, and for a lot of the small- to medium-scale jobs, the recurse-filter-process logic is a significant part of the work. Maybe the actual payload only amounts to a couple of lines of code. If you had a simple way to re-use the recurse-filter-process part, you could do a lot of quick-and-dirty jobs, erm... pretty quickly.

    In Javascript, I can paste the following function in to a template or a script file, and the recurse-filter-process part is done:

    function recurseItems(orgItem, filter, process){
      // Walk the contents of the organizational item, recursing into folders and
      // structure groups, and run "process" on every item that "filter" accepts.
      var items = orgItem.GetItems();
      var eItems = new Enumerator(items);
      for (;!eItems.atEnd();eItems.moveNext()){
        var itemType = eItems.item().ItemType;
        if (itemType === TDSDefines.ItemType.Folder || itemType === TDSDefines.ItemType.StructureGroup){
          recurseItems(eItems.item(), filter, process);
        } else {
          if (filter(eItems.item())) {
            process(eItems.item());
          }
        }
      }
    }

    I'm quite sure you could tidy this up a bit, but whatever... What makes this so straightforward to re-use is the fact that in JavaScript, functions are first-class objects, and it's very easy to pass functions as arguments to another function, and invoke them from within that function. The recurseItems() function expects to be passed an organizational Item as its first argument. (OK - as written, this won't work for categories, or for the sake of argument, publications, but bear with me...)

    The "filter", and "process" arguments are functions.

    Let's say I want to list all the components in a given hierarchy. I could write something like this (by which I mean this code isn't tested, it's for illustration purposes, right?):

    function isComponent(item){return item.ItemType === TDSDefines.ItemType.Component;}
    function outputTitle(item){WScript.Echo(item.Title + "\n");}
    var topFolder = tdse.GetObject("tcm:1-1234-2", TDSDefines.OpenMode.View);
    recurseItems(topFolder, isComponent, outputTitle);

    So with 4 lines of code, I've listed the items I'm interested in. Not bad, eh? OK - I cheated. But how? Take a look at the attached file TDSDefines.js. It's a port of the standard TDSDefines constants to JavaScript. The cheating part is that I instantiate the "tdse" object in there, which somewhat breaks the purity of having a TDSDefines file, but you're always going to want tdse, so wtf not? Anyway - this file is what allows me to type things like "TDSDefines.ItemType.Component", or TDSDefines.OpenMode.View, instead of 16, or 1. JavaScript lends itself very well to this kind of nested data structure, in ways that VBScript would struggle with.
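
    Just to give you a feel for the shape of it, here's a heavily abbreviated sketch of the kind of nested structure I mean (the item type numbers for Folder, Structure Group and Component are the standard Tridion ones; the real TDSDefines.js attached to this post is much more complete):

    // Roughly how you'd get hold of tdse under the cscript host
    var tdse = new ActiveXObject("TDS.TDSE");
    var TDSDefines = {
      ItemType: {
        Folder: 2,
        StructureGroup: 4,
        Component: 16
        // ... and so on for the rest of the item types
      },
      OpenMode: {
        View: 1
        // ... etc.
      }
    };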

    Assuming you are using the cscript host on your Tridion server, and that TDSDefines.js is in the same directory as your script, you'll need to use a couple of lines of code to import your "defines".

        var fso = new ActiveXObject("Scripting.FileSystemObject");
        eval(fso.OpenTextFile(fso.BuildPath(fso.GetParentFolderName(WScript.ScriptFullName), "TDSDefines.js"), 1).ReadAll());

    Inside Tridion, of course, TDSDefines.js just becomes a template building block, and gets included in the normal way.

    Funnily enough, I never got round to porting the default template code to JavaScript, and now, presumably, I never will. All the same, using JavaScript for this kind of work has allowed me to practice and get more familiar with the language, and to have a toolkit that allows for very, very fast creation of quick-and-dirty recursion scripts among others. It doesn't end there. JavaScript, or perhaps JSON, allows you to take a script-as-data approach which will get you to your desired result much quicker than, for example, having to write scripts that crunch through XML data files. Perhaps that would make a good subject for a future post.

    In the meantime, I hope this gives you yet another excuse to hone your JavaScript skills. Those skills will definitely come in useful.

    Engineers and bloody fools

    Posted by Dominic Cronin at Jul 22, 2010 11:35 PM

    "An engineer is a man who can do for five bob what any bloody fool can do for a quid."

    I was scratching around today for the origin of this quote, or even how the exact quote ought to go. A bit of googling this evening solved it. It was Nevil Shute - apparently! At least the quote comes from one of his books, and is probably something he might have said.

    http://www.nevilshute.org/Biography/alanbesterbio.php

    Just in case anyone's struggling with the "old money" - there were twenty bob in a quid. If anyone is struggling with the social conventions of the day, well I don't suppose women were included in the bloody fools either, but in any case, take it to your local chapter... nothing to see here.