Dominic Cronin's weblog

The Tridion XSLT mediator, and page templates that don't template pages

Posted by Dominic Cronin at May 31, 2015 05:01 PM

My current customer is in the process of upgrading to Tridion 2013, and as part of that, switching over from the old XSLT mediator to the new. By the new one, I mean the one that was released by SDL as part of the 2013 product. The new one is very similar to the old one, and changing over hasn't created any significant problems, but I came across a quirk this week that I thought worth sharing.

The symptoms were pretty odd, and it took me a while to figure out what was happening. One of the page templates was emitting entirely the wrong content. In fact, instead of templating out a page, what I could see was more or less a copy of the XML of a component that was on the page. If you have the same problem, your symptoms may be different, depending on how you've written your XSLT template. It's also possible that you'd get no output at all, or something else entirely.

Obviously, what the XSLT mediator does is use your XSLT to transform an XML document. The question is - where does the XML document come from? The documentation states "By default, the XSLT template transforms the item called Component (if the Template Building Block is part of a Component Template) or Page (if part of a Page Template)."

This is not really as true as you might hope. It would actually be possible for the mediator to determine the type of item being rendered, but it doesn't do that. What it does by default is look in the package for an item called "Component", and if it finds one, that's what it transforms. Only if it doesn't find one does it look to see if there's an item called "Page". If you ask me, that logic is backwards: a Component package is far less likely to contain a Page item than a Page package is to contain a Component item. (There's even a standard Tridion TBB whose purpose is to add a Component item to a Page package.)

Anyway - all is not lost - the fix is quite simple. It's possible to override the default behaviour by specifying an input item. The old mediator used template building block parameters to control this behaviour, and IMNSHO that was a good choice in the context. With the new mediator, you do the same thing by using a processing instruction in the XSLT itself, or specifically by setting an attribute on the PI. In fact, if you look at the documentation, you can combine the two techniques, but that would surely be overkill for this.

So the bottom line is that in a page template, you should have something like this:

<?XsltMediator inputItemName="Page"?>

You may have noticed that I said "should". Yes - really... "should". Then you're clear that you want your page to render the Page item, even if at some point in the future, one of your colleagues decides to add an item called Component to the package. When the default behaviour is as non-obvious as this, it's wise not to rely on the defaults.
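
Just to put that in context, here's a minimal sketch of what such a page template XSLT might look like. The tcm namespace binding and the XPath into the Page document are purely illustrative (check the real Page XML on your own system), and I've put the PI right at the top - the mediator documentation is the place to check exactly where it's allowed to live. The point is simply that the PI travels with the XSLT, so your intention is explicit whatever else ends up in the package:

<?xml version="1.0" encoding="UTF-8"?>
<?XsltMediator inputItemName="Page"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:tcm="http://www.tridion.com/ContentManager/5.0">
  <xsl:output method="html"/>
  <!-- We've told the mediator to transform the Page item, so match on the Page document -->
  <xsl:template match="/">
    <html>
      <body>
        <!-- illustrative only: pull something recognisable out of the Page XML -->
        <h1><xsl:value-of select="/tcm:Page/tcm:Data/tcm:Title"/></h1>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>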

This was a public service announcement - or perhaps it was my penance for not spotting sooner what was going on.

Getting the complete component XML

One of the basic operations that a Tridion developer needs to be able to do is getting the full XML of a Component. Sometimes you only need the content, but say, for example, that you're writing an XSLT that transforms the full Component document - you need to be able to get an accurate representation of the underlying storage format. (OK - for now let's just skate over the fact that different versions have different XML formats under the water.)

In the balmy days of early R5 versions, this was pretty easy to do. The Tridion installation included a "protocol handler", which meant that if you just pasted a TCM URI into the address bar of your browser, you'd get the XML of that item displayed in the browser. This functionality was actually present so that you could reference Tridion items via the document() function in an XSLT, but this was a pretty useful side effect. OK... you had to be on the server itself, but hey - that's not usually so hard for a developer. If you couldn't get on the server, or you found it less convenient, another option was to configure the GUI to be in debug mode, and you'd get an extra button that opened up some "secret" dialogs that gave you access to, among other things, the XML of the item you had open in the GUI.

Moving on a little to the present day, things are a bit different. Tridion versions since 2011 have a completely different GUI, and XSLT transforms are usually done via the .NET framework, which has other ways of supporting access to "arbitrary" URIs in your XSLT. The GUI itself is built on a framework of supported APIs, but doesn't have a secret "debug" setting. However, this isn't a problem, because all modern browsers come fully loaded with pretty powerful debugging tools.

So how do we go about getting the XML if we're running an up-to-date version of Tridion? This question cropped up just a couple of days ago on my current project, where there's an upgrade from Tridion 2009 to 2013 going on. I didn't really have a simple answer - so here's how the complicated answer goes:

My first option when "talking to Tridion" is usually the core service. The TOM.NET API will give you the XML of an item directly via the .ToXml() methods. Unfortunately, someone chose not to surface this in the core service API. Don't ask me why. Anyway - for this kind of development work, you could use the TOM.NET. You're not really supposed to use the TOM.NET for code that isn't hosted by Tridion (such as templates), but on your development server, what the eye doesn't see the heart won't grieve over. Of course, in production code, you should take SDL's advice on such things rather more seriously. But we're not reduced to that just yet.

Firstly, a brief stop along the way to explain how we solved the problem in the short term. Simply enough - we just fired up a powershell and used it to access the good-old-fashioned TOM.COM. Like this:

PS C:\> # TDS.TDSE is the good old TOM.COM entry point (which needs the legacy pack on 2013)
PS C:\> $tdse = new-object -com TDS.TDSE
PS C:\> # GetXml(1919) - the habitual XMLReadFilter value that asks for more or less everything
PS C:\> $tdse.GetObject("tcm:2115-5977",1).GetXml(1919)

Simple enough, and it gets the job done... but did I mention? We have the legacy pack installed, and I don't suppose this would work unless you have it too.

So can it be done at all with the core service? Actually, it can, but you have to piece the various parts together yourself. I did this once, a long time ago, and if you're interested, you can check out my ComponentFactory class over on a long lost branch of the Tridion power tools project. But that's probably too much fuss for day to day work. Maybe there are interesting possibilities for a powershell module to make it easier, but again.... not today.

But thinking about these things reminded me of the Power Tools project. One of the power tools integrates an extra tab into your item popup, giving you the raw XML view. I'd been thinking to myself that the GUI API (Anguilla) probably had reasonably easy support for what we're trying to do, but I didn't want to go to the effort of figuring it all out. Never fear: after a quick poke around in the sources I found a file called ItemXmlTab.ascx.js, and there I found the following gem:

var xmlSource = $display.getItem().getXml();

That's all you need. The thing is... the power tool is great. It does what it says on the box, and as far as I'm concerned, it's an exceedingly good idea to install it on your development server. But still, there are reasons why you might not. Your server might be managed by someone else, and they might not be so keen, or you might be doing some GUI extension development yourself and want to keep a clear field of view without other people's extensions cluttering up the system. Whatever - sometimes it won't be there, and you'd still like to be able to just suck that goodness out of Tridion.

Fortunately - it's not a problem. Remember when I said most modern browsers have good development tools? We use them all the time, right? F12 in pretty much any browser will get you there - then you need to be able to find the console. Some browsers will take you straight there with Ctrl+Shift+J. So you just open the relevant Tridion item, go to the console and grab the XML. Here's a screenshot from my dev image.

[Screenshot: the browser console showing the Tridion GUI]

So now you can get the XML of an item on pretty much any modern Tridion system without installing a thing. Cool, eh? Now some of you at the back are throwing things and muttering something about shouldn't it be a bookmarklet? Yes it should. That's somewhere on my list, unless you beat me to it.

Hotfix rollups are the new Service Pack

I was recently surprised to learn that a Hotfix Rollup shipped from SDL Tridion is something quite different to what you'd expect from the title. For at least the last 10 years, and probably longer, the distinction between a hotfix and a service pack was very simple:

Service Pack

A collection of product improvements shipped between full version releases. The improvements would include bug fixes, and possibly new features, but never "breaking" changes. The intention was that customers should install the latest service pack for their current version. The service pack would have been thoroughly tested by R&D and would be the basis for on-going support until the next release.

Hotfix

If an issue was found in software in the field, a hotfix could be created to address this issue. There wouldn't be an installer - just some files and some instructions. Often a hotfix would be seen as suitable for any customer to install, but other hotfixes were riskier, and if you didn't have the problem, installing the hotfix would be a bad idea. Hotfixes were tested by customer support. The next service pack or full release would supersede any hotfix. In a reasonably thorough risk-management strategy, the standard play was to avoid taking hotfixes until you needed them. The official advice from Tridion as of 2011 was this:

IMPORTANT NOTE: Hotfixes are released at the discretion of SDL Tridion based on technical complexity, customer business requirements and schedules. Hotfixes are made and tested only for the described problem on a particular environment/configuration and therefore should only be installed if approved by SDL Tridion Customer Support. Hotfixes should be replaced as soon as possible by the subsequent service pack where the problem is fixed.

And then along came Hotfix Rollups...

Hotfix rollups

You might be forgiven for thinking that a hotfix rollup was, well, a sort of erm... roll-up of hotfixes. A collection of hotfixes. A gathering together of a handy bunch of hotfixes to make life easier for the less risk-averse who like to install everything. (Like me, when I'm installing my own dev image. Love the handiness of it.) That's what the name means in any normal interpretation of the English language. The point here is that this is not what SDL Tridion mean when they say Hotfix Rollup. From discussions with various SDL people, it seems that they see a hotfix rollup as having the following characteristics:

  • It is not expected to cause any problems on your system and can safely be installed.
  • To this end, it has been tested by the relevant specialists in R&D.
  • In the same way that you are expected to install a service pack, you are expected to install a hotfix rollup. Should further hotfixes become necessary, they will have the hotfix rollup as a dependency, not specific hotfixes. (This means that if you need that hotfix, you'll end up installing the hotfix rollup too, probably at a moment that you'd prefer to have chosen yourself.)

 

This is my best understanding at the current moment, but I am not aware of any formal communication from SDL that makes this clear, or otherwise updates the advice from 2011. Obviously, feel free to get formal confirmation via the usual channels.

And as for you, SDL: your customers' risks are not your risks. You owe it to your customers to communicate correctly and in a timely way about this kind of thing. If anyone thought this would engender trust and confidence, that person was not thinking clearly. I wouldn't bother saying this, except that people out in the field often spend significant effort trying to balance risks like this, and it's in all our interests to make sure it goes well.

The Tridion Bookmarklet Challenge: We have a winner!

Posted by Dominic Cronin at Feb 08, 2015 09:33 PM

Those of you who have been following the Tridion bookmarklet challenge from the beginning will already know this, of course, but we have a winner. As I noted in my round up of the entries, Alexander Orlov (UI Beardcore) had, in addition to his own entry, the highest number of credits from other entrants for his assistance. It's fitting, therefore, that his entry found itself voted to the top of the heap, and came in as the clear winner with 15 votes at the close of play. Congratulations therefore to Alexander, not only for technical excellence, but also for his contribution to the true spirit of the competition.

The GIT command line is actually useful

Posted by Dominic Cronin at Jan 25, 2015 06:49 PM

I tend to use a mix of GIT clients. Tortoise GIT is probably my main workhorse, as most of the time, managing my work via Windows Explorer is a pretty good model. Some people at the office are keen on Atlassian SourceTree, so I'll probably end up using that on projects where that's the standard tooling. Microsoft Visual Studio 2013 works pretty well with GIT, and there are plug-ins for earlier versions.

So why would I ever need a command line client, you might ask? And yet, I habitually also install GIT for Windows (and trust me, it's not because I need another graphical client). I actually like having a command line client around, as I sometimes have the idea that some things might be simpler that way. I'm also scratching an itch to always know a couple of ways of doing something. Realistically, I don't invest much effort in learning the command line, but I like to have it around, just in case... even at the cost of occasionally having to fend off accusations that I'm just trying to be some kind of geek... <sigh>

And then, one day at work this week, it actually turned out that the command line was the only way to get the job done. I needed to clone a repository that was on a remote machine on a file share. Should be easy, you'd think, but the remote location needed to be specified as a UNC address, and Tortoise GIT just wouldn't cope with the syntax.

So I reached for the command line, and after some cursory research it turned out what I needed to do was:

git remote add blah //SERVERNAME/c$/code/Blah

Once I'd done that, I could continue to use Tortoise, as the "blah" repository just showed up in the list...

So there you have it. I am completely justified and vindicated in keeping this software on my computers. :-) Maybe I'll practice a bit more and find some other good reasons to keep it around.

Publishing binary assets from SDL Tridion

Posted by Dominic Cronin at Jan 16, 2015 03:24 PM

I've just published my latest article on the Indivirtual Blog. It's about Publishing binary assets from SDL Tridion. Please feel free to hop on over there and have a read.

Tridion bookmarklet challenge: the entries

Back in July, I issued the Tridion Bookmarklet challenge, and shortly afterwards set up a question on meta.tridion.stackexchange.com to manage the entries and the voting. The closing date for entries was 31 Dec 2014, so we've now had all the entries: a grand total of 13. Over the next month, we'll be collecting votes to see which entry, in the eyes of the community, was the best.

For the first few months of the challenge, there was very little activity. I even got a bit nervous that we'd have hardly any entries. I needn't have worried; it hotted up nicely towards the end.

So what did we get for entries? Quite a mixed bag actually - and somewhere in there, quite a few interesting technical insights.

The first entry wasn't a bookmarklet at all. Roel Van Roozendaal chose to create a Chrome browser extension instead. This was a re-implementation of the delete message centre messages functionality that had kicked off the whole bookmarklet discussion in the first place, so nothing new from a functional point of view, but his article shows clearly how to wrap up the functionality in an extension. Certainly there's enough there to inspire you if you were thinking of doing something similar.

We also had an entry that was a bookmarklet, but didn't extend the GUI. Well it was already clear that the rules were pretty relaxed - we're more interested in useful, interesting, inspiring... even if I've been known for pedantry in other contexts :-) So go and check out the entry by Jan Horsman (Jan H), which makes it simple to log in to the Tridion documentation site.

We had multiple entries from several people. Robert Curlette (robrtc) entered Count Items in View, and later Get Schema Id and Title. The first of these really showed the community process in action, with Robert reaching out with a question on Tridion Stack Exchange (TREX). In his second entry, Robert also bravely 'fessed up to his ignorance on TREX and was rewarded. (See how that works, people? He now knows more than he did, and he's helped the rest of us. We need askers as well as answerers - and fortunately Robert does well in both categories.)

Chris Morgan came up with a couple: Get WebDAVUrl and Open Schema. Both of these are going to be useful to Tridion developers doing their daily work.

Frank Taylor (paceaux) also made two entries. He started with PubUp, which simply lets you navigate to where an item is in the BluePrint. Once he'd entered PubUp, the inevitable community interaction took hold, and he was diving into the Anguilla framework with the help of Nuno Linhares. Lo and behold, out of this process, he was inspired also to enter his AnguillaMediator bookmarklet.

Pankaj Gaur created a bookmarklet that would display the file information of a multimedia component. That's not displayed by default in the Tridion GUI, so there will definitely be people who want this. He also did a bookmarklet that would localize an item.

Not everyone did two :-) Rob Stevenson-Leggett did one to rename a Tridion item (which was always an awkward thing to do), and Jonathon Williams created Get Creation Info. Don't worry guys - one entry is quite enough; these are both useful offerings, and even with two entries, only one can win.

Alexander Orlov (UI Beardcore) also made a single entry, with his Multiple Upload bookmarklet. Alexander is also notable as the person credited with the most "assists" in the challenge. Several people have made use of what we're coming to know as the Beardcore hack to acquire a reference to the Anguilla API in the Tridion GUI.

So thanks to you all for taking the time and trouble to create these bookmarklets and take part in the challenge. It's been great to see the spirit of friendly rivalry, with people learning from each other and even helping to improve competing efforts. I don't know who is going to win, but every single one of you has helped the community by spreading your knowledge and expertise.

Vim Windows weirdnesses

Posted by Dominic Cronin at Dec 22, 2014 10:43 PM

This is just a quick note-to-self to remind me of the stuff I always forget when installing plugins and the like for Vim on a Windows machine. So of course this means gVim. The confusing thing is always that the documentation for everything refers to your ~/.vim directory. And - you haven't got one. Here's the note to self.

Your ~/.vim directory is called vimfiles

And ~ is probably somewhere like C:\Users\dominic - your .vimrc will be there too, so you can find it by running vim and doing

:echo $MYVIMRC

Adding publications to Tridion from the core service.

Posted by Dominic Cronin at Nov 06, 2014 07:44 PM

Just a heads-up about my new blog post on the Indivirtual blog. This time it's about Adding publications to Tridion from the core service.

Default namespaces (again - this time in XSLT and XPath)

Posted by Dominic Cronin at Sep 07, 2014 04:22 PM

For a long time, the default XSLT for a Rich Text Format area (RTF) in Tridion began with a fairly standard namespace declaration, like this:

<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

It's very normal to begin an XSLT this way: the XSLT elements are in the namespace referenced by the "xsl" prefix, and you're free to use the default namespace for output that you want to define. In an RTF XSLT, you'll generally want to be outputting HTML elements without a namespace. (Often these HTML fragments will be embedded in a document that gives them a namespace, but that's another story.) Indeed, the document you are transforming will also be without a namespace. (The native storage of an RTF is in the XHTML namespace, but that's removed before the XSLT is invoked.)

So normally, you can write templates with match rules that simply address an HTML element by name - for example:

<xsl:template match="b">
  <strong>
    <xsl:apply-templates/>
  </strong>
</xsl:template>

I've just checked on my Tridion 2013 system, and the default RTF XSLT is done in this style, but I've seen examples in the field that use a rather different, and perhaps bizarre, style to achieve the same thing. (I have a suspicion that there was a version of Tridion that used this style, or maybe it was something to do with an upgrade that converted to it, but I have no direct evidence. Maybe there's another explanation as to why we see this in the field.)

This other style I'm referring to looks quite different. It starts with a namespace declaration like this:

<stylesheet xmlns="http://www.w3.org/1999/XSL/Transform" version="1.0"> 

As you see, the XSLT namespace is used as the default, enabling the template code to be slightly terser. The same template would look like this:

<template match="b">
  <strong xmlns="">
    <xsl:apply-templates/>
  </strong>
</template>

When I first saw this kind of code I was almost pulling my hair out. I actually had to manually execute a test XSLT to prove to myself that it would actually work. And it does.
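
In case you feel the urge to run the same test, it doesn't need to be any more complicated than the sketch below (the file contents are mine, of course). Note that I've also bound the xsl prefix on the stylesheet element: once you've reset the default namespace on the strong element with xmlns="", any XSLT instruction inside it still needs a prefix that is bound to the XSLT namespace.

The input document:

<b>hello</b>

The stylesheet, in the "default namespace is XSLT" style:

<stylesheet version="1.0"
            xmlns="http://www.w3.org/1999/XSL/Transform"
            xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <output method="xml" omit-xml-declaration="yes"/>
  <!-- "b" is an unprefixed name, so XPath looks for a b element that's in no namespace at all -->
  <template match="b">
    <strong xmlns="">
      <xsl:apply-templates/>
    </strong>
  </template>
</stylesheet>

Run that through any XSLT 1.0 processor and the b duly matches, giving you <strong>hello</strong>.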

What was upsetting me? Well, I was convinced that the match should fail. I mean surely, you'd have to do something like:

<template match="empty:b" xmlns:empty="">
  <strong xmlns="">
    <xsl:apply-templates/>
  </strong>
</template>

See what I mean? That b shouldn't match... right?

Well anyway - it turns out I was wrong. XSLT has explicit provision for this case. Looking at the recommendation for XSLT, we see this:

the set of namespace declarations are those in scope on the element which has the attribute in which the expression occurs; this includes the implicit declaration of the prefix xml required by the XML Namespaces Recommendation [XML Names]; the default namespace (as declared by xmlns) is not part of this set

So XSLT doesn't let the XPath see the default namespace, but it wouldn't matter, because XPath would ignore it anyway. From the XPath recommendation:

A QName in the node test is expanded into an expanded-name using the namespace declarations from the expression context. This is the same way expansion is done for element type names in start and end-tags except that the default namespace declared with xmlns is not used: if the QName does not have a prefix, then the namespace URI is null (this is the same way attribute names are expanded).

So in an XSLT, the normal namespace weirdness that we're used to for attributes applies to elements as well.

Well, that explains that then. I think I'll be sticking to the more conventional approach though.