Dominic Cronin's weblog

Showing blog entries tagged as: Powershell

Revisiting validateXml

Some time back in 2009 I blogged about validating Tridion's content delivery configuration files. It was a good idea then, and it's remained a good idea ever since. These days, we're dealing with SDL Web 8, and with the new micro-services architecture you've got a lot of configuration files to get right. (On my fairly unambitious test system, running staging and live together, I just counted almost 80 configuration files.) Fortunately, these seem to be reliably supported with schema files that are simply present in each of the microservice folders that you copy during an installation.

Back when I first wrote the ValidateXmlFile PowerShell function, I'd left it rather unfinished. It was good enough to let me do some validations and detect problems, but it had a significant flaw: if a schema file was not present at the location indicated by the noNamespaceSchemaLocation attribute, it would simply not bother with validation. Considering that we're using an XmlReader to do the validation, this is a pretty reasonable design decision - after all, the main purpose is to read in the XML, and validation is perhaps a bit of a side-effect. Fair enough, but it's a nasty hole in our defences, so now that I'm revisiting the technique, I've beefed up the script a bit so that it checks that the location attribute is present and that there's actually a file at that location.

I've also made sure that the script does some pushd/popd to make sure that everything is nicely lined up when the location is relative to the file (which it generally is).

Here's the updated script:

function ValidateXmlFile {
    param ([string]$xmlFile = $(read-host "Please specify the path to the Xml file"))
    $xmlFile = resolve-path $xmlFile
    "==============================================================="
    "Validating $xmlFile using the schema locations specified in it"
    "==============================================================="
    # The validating reader silently fails to catch any problems if the schema locations aren't set up properly,
    # so attempt to get to the right place....
    pushd (Split-Path $xmlFile)

    try {
        $ns = @{xsi='http://www.w3.org/2001/XMLSchema-instance'}
        # Of course, if it's not well formed, it will barf here. Then we've also found a problem.
        # Use * in the XPath because not all files begin with Configuration any more. We'll still
        # assume the location is on the root element.
        $locationAttr = Select-Xml -Path $xmlFile -Namespace $ns -XPath '*/@xsi:noNamespaceSchemaLocation'
        if ($locationAttr -eq $null) {throw "Can't find schema location attribute. This ain't gonna work"}

        # Resolve the attribute's value relative to the file's own directory (we pushd'ed there above)
        $schemaLocation = resolve-path $locationAttr.Node.Value -ErrorAction SilentlyContinue
        if ($schemaLocation -eq $null)
        {
            throw "Can't find schema at location specified in Xml file. Bailing"
        }

        $settings = new-object System.Xml.XmlReaderSettings
        $settings.ValidationType = [System.Xml.ValidationType]::Schema
        $settings.ValidationFlags = $settings.ValidationFlags `
                -bor [System.Xml.Schema.XmlSchemaValidationFlags]::ProcessSchemaLocation
        $handler = [System.Xml.Schema.ValidationEventHandler] {
            $evtArgs = $_ # entering new block, so copy $_ (and avoid shadowing the automatic $args variable)
            switch ($evtArgs.Severity) {
                Error {
                    # Exception is an XmlSchemaException
                    Write-Host "ERROR: line $($evtArgs.Exception.LineNumber)" -nonewline
                    Write-Host " position $($evtArgs.Exception.LinePosition)"
                    Write-Host $evtArgs.Message
                    break
                }
                Warning {
                    # So far, everything that has caused the handler to fire has caused an Error...
                    # so this /might/ be unreachable
                    Write-Host "Warning: $($evtArgs.Message)"
                    break
                }
            }
        }
        $settings.add_ValidationEventHandler($handler)
        $reader = [System.Xml.XmlReader]::Create($xmlfile, $settings)
        while($reader.Read()){}
        $reader.Close()
    }
    catch {
        throw
    }
    finally {
        popd
    }
}

Of course, what you really want is to be able to verify all your configurations in one go. Once the script is in your PowerShell $profile, you can put together some fairly simple command-line-fu to take care of that. I have all my microservices in one directory, which I guess is a pretty common pattern, so all I had to do was CD over there and execute the following:

gci -r -file -include *conf.xml | % {ValidateXmlFile $_}

By running this, I've also picked up a couple of things that might be false positives. That aside, this is a real time saver if you're trying to solve issues. There's nothing like being able to eliminate a lot of the stupid typos from consideration all in one go.

Using the PowerShell to parse columns out of strings

Posted by Dominic Cronin at Jul 30, 2016 02:15 PM |

I've been kicking the tyres on Docker, and after a fairly short while I noticed that my list of containers was getting a little full. I decided to clean up, and after a quick look at the documentation, realised that I'd first have to run "docker ps -a" to get a list of all my containers, and then filter the list to get the ones I wanted to delete. (The alternative was to read through the list and manually execute "docker rm" on each one that I wanted to delete, and I'm far too lazy for that.)

Here's what the output from "docker ps -a" looks like:

CONTAINER ID        IMAGE                  COMMAND                  CREATED             STATUS                         PORTS               NAMES
f7a3b9bb073c        dominiccronin/gentoo   "/bin/bash"              33 minutes ago      Exited (127) 33 minutes ago                        adoring_bell
2ec710c32df0        dominiccronin/gentoo   "/bin/bash"              16 hours ago        Exited (0) About an hour ago                       hungry_pare
7805ed925e51        gentoo/portage         "sh"                     16 hours ago        Created                                            portage
43c207846b56        dominiccronin/gentoo   "/bin/bash"              16 hours ago        Exited (127) 16 hours ago                          big_goodall
bbcc2e6d87d1        dominiccronin/gentoo   "/bin/bash"              18 hours ago        Exited (0) 18 hours ago                            infallible_mayer
f710c351291d        ubuntu:14.04           "C:/Program Files/Git"   8 months ago        Created                                            hopeful_archimedes
94acf6155aba        ubuntu:14.04           "C:/Program Files/Git"   8 months ago        Created                                            drunk_mahavira
e5bf3c39aa9e        ubuntu:14.04           "C:/Program Files/Git"   8 months ago        Created                                            desperate_pasteur
22ace2ca4ba1        ubuntu                 "C:/Program Files/Git"   8 months ago        Created                                            furious_brattain
a20746611b7b        67af10dd2984           "/bin/sh -c '/usr/gam"   9 months ago        Exited (0) 9 months ago                            berserk_goodall
398be811cb6a        67af10dd2984           "/bin/sh -c '/usr/gam"   9 months ago        Exited (0) 9 months ago                            fervent_torvalds
6363467ab659        67af10dd2984           "/bin/sh -c '/usr/gam"   9 months ago        Exited (0) 9 months ago                            grave_bardeen
b21bbf5103f0        67af10dd2984           "/bin/sh -c '/usr/gam"   9 months ago        Exited (0) 9 months ago                            ecstatic_feynman
56f1700ba2ca        67af10dd2984           "/bin/sh -c '/usr/gam"   9 months ago        Exited (0) 9 months ago                            elated_elion
0d41f9675f61        docker/whalesay        "cowsay boo-boo"         9 months ago        Exited (0) 9 months ago                            hopeful_brown
7309c5215e9f        docker/whalesay        "cowsay fooobar"         9 months ago        Exited (0) 9 months ago                            berserk_payne
23c1b894cec2        docker/whalesay        "whalesay fooobar"       9 months ago        Created                                            lonely_jones6
6a8c27a31740        docker/whalesay        "cowsay boo"             9 months ago        Exited (0) 9 months ago                            mad_jones
e5ca9dec78bc        docker/whalesay        "cowsay boo"             9 months ago        Exited (0) 9 months ago                            sleepy_ardinghelli
43c4d5c7a996        hello-world            "/hello"                 9 months ago        Exited (0) 9 months ago                            cocky_khorana
cbfe9e33af32        hello-world            "/hello"                 9 months ago        Exited (0) 9 months ago                            mad_leakey

The "hello, world" examples for Docker are all based on Docker's "theme animal", which is a whale, so if I could identify all the items where the image name contained the string "whale", I'd be on to a good thing. The only problem was that when you run a docker command like this in the powershell, all you get back is a list of strings. The structure of the columns is lost. A quick google showed that there is a Powershell module that might allow me to be even more lazy in the future but the thought of not being able to do it directly from the shell irritated me. So... here goes... this is how you do it:

docker ps -a | %{,@($_ -split ' {2,}')} | ?{$_[1] -match 'whale'} | %{docker rm $_[0]}

Yes, yes, I get it. That looks like the aftermath of an explosion in the top row department of a keyboard factory, so let's take it down a bit.

The interesting part is probably the second element in the pipeline. So after "docker ps -a" has thrown a list of strings into the pipeline, the second element is where I'm deconstructing the string into its constituent columns. The '%' operator is shorthand for 'foreach', so every line will be processed by the script block between the braces, and the line itself is represented by the built-in variable '$_'. (In the third element, you can see a similar construction but with a '?', so instead of a 'foreach', it's a 'where'.)

You can use a Regex with the split operator, and here I've used ' {2,}' to indicate that if there are 2 or more spaces together, I wish to use that as a column separator. Some of the columns are free text, with spaces in them, so I'm taking this pragmatic approach to avoid matching on a single space. Of course, there will be edge cases that break this, so I heartily recommend that you test the results first before actually doing 'docker rm'. Just replace the last element with something like "%{$_[1]}".
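For example, a dry run might look like this (just a sketch; it prints the container ID and image for the matching rows instead of deleting anything):

# Preview the matches first: show container id (column 0) and image (column 1)
docker ps -a | %{,@($_ -split ' {2,}')} | ?{$_[1] -match 'whale'} | %{"$($_[0])  $($_[1])"}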

Having got the line split into columns, the next challenge is the PowerShell itself. If you throw anything that looks like a collection into the pipeline, it will get automatically unwrapped, and each item will be processed separately in the next block. So here, I'm wrapping the split in an array expression @(), and then preceding that with a comma. The comma operator is used to join a list of items into an array. Usually, this is something like 'a','b','c' - but it works just as well with a single operand, and so ,@(...) gets us an array containing an array. Then when it gets unwrapped by the pipeline, we have just the array containing the split fields. This means that in the third pipeline element we can filter on the value of $_[1] which is the IMAGE field. The fourth element actually invokes "docker rm" using the CONTAINER ID ($_[0]).
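You can see the unwrapping behaviour, and the comma fix, in isolation with a trivial example:

# Without the comma, the array is unwrapped and each field travels down the pipeline separately
@('one two  three' -split ' {2,}') | % { $_ }         # two strings: 'one two' and 'three'
# With the comma, the pipeline unwraps the outer array and the inner array arrives intact
,@('one two  three' -split ' {2,}') | % { $_.Count }  # 2 - the whole array, still in one piece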

I've used Docker as the basis for this example. Just for the record, using the Docker PowerShell module I mentioned, I managed to remove all my Ubuntu containers like this:

Get-Container | ?{$_.Image -match 'bun'} | Remove-Container

But as I said, I'm just using Docker as an example. This PowerShell technique will also help you in many situations where there isn't a module available for the task at hand.

Checking your DXA/DD4T JSON in the SDL Web broker database

Over at the Indivirtual blog, I've posted about a diagnostic technique for use with the SDL Web broker database.

https://blog.indivirtual.nl/checking-dxadd4t-json-sdl-web-broker-database/

Enjoy!

Testing the SDL Web 8 micro-services

Posted by Dominic Cronin at May 13, 2016 11:43 AM |

Over at blog.indivirtual.nl, I've just blogged about testing the SDL Web 8 microservices.

Finding your way around the SDL Web 8 cmdlets

Posted by Dominic Cronin at Mar 30, 2016 08:55 PM |

In SDL Web 8, there are far more things managed via Windows PowerShell than there used to be in previous releases of the product. On the one hand, this makes a lot of sense, as the PowerShell offers a clean and standardised way to interact with various settings and configurations. Still, not everyone is familiar enough with the PowerShell to immediately get the most out of the cmdlets provided by the SDL modules. In fact, today, someone told me quite excitedly that they'd discovered the Get-TtmMapping cmdlet. My first question was "Have you run Get-Command on the SDL modules?"

The point is that with the PowerShell, quite a lot of attention is paid to discoverability. Naming conventions are specified so that you have a good chance of being able to effectively guess the name of the command you need, and other tools are provided to help you list what is available. The starting point is Get-Module. To list the modules available to you, you invoke it like this: 

get-module -listavailable

This will list a lot of standard Windows modules, but on your SDL Web 8 Content Manager server, you should see the following at the bottom of the listing: 


    Directory: C:\Program Files (x86)\SDL Web\bin\PowerShellModules

ModuleType Version Name                               ExportedCommands
---------- ------- ----                               ----------------
Binary     0.0.0.0 Tridion.ContentManager.Automation  {Clear-TcmPublicationTarget, Get-TcmApplicationIds, Get-Tc...
Binary     0.0.0.0 Tridion.TopologyManager.Automation {Add-TtmSiteTypeKey, Add-TtmCdEnvironment, Add-TtmCdTopolo...
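If the listing is long, you can filter it down - here assuming, as on my system, that the SDL module names contain 'Tridion':

# List only the SDL modules
Get-Module -ListAvailable | Where-Object { $_.Name -match 'Tridion' }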

This gives you the names of the available SDL modules. From here, you can dig in further to list the commands in each module, like this: 

get-command -module Tridion.TopologyManager.Automation

This gives you the following output: 

CommandType     Name                             ModuleName
-----------     ----                             ----------
Cmdlet          Add-TtmCdEnvironment             Tridion.TopologyManager.Automation
Cmdlet          Add-TtmCdTopology                Tridion.TopologyManager.Automation
Cmdlet          Add-TtmCdTopologyType            Tridion.TopologyManager.Automation
Cmdlet          Add-TtmCmEnvironment             Tridion.TopologyManager.Automation
Cmdlet          Add-TtmMapping                   Tridion.TopologyManager.Automation
Cmdlet          Add-TtmSiteTypeKey               Tridion.TopologyManager.Automation
Cmdlet          Add-TtmWebApplication            Tridion.TopologyManager.Automation
Cmdlet          Add-TtmWebsite                   Tridion.TopologyManager.Automation
Cmdlet          Clear-TtmCdEnvironment           Tridion.TopologyManager.Automation
Cmdlet          Clear-TtmMapping                 Tridion.TopologyManager.Automation
Cmdlet          Disable-TtmCdEnvironment         Tridion.TopologyManager.Automation
Cmdlet          Enable-TtmCdEnvironment          Tridion.TopologyManager.Automation
Cmdlet          Export-TtmCdStructure            Tridion.TopologyManager.Automation
Cmdlet          Get-TtmCdEnvironment             Tridion.TopologyManager.Automation
Cmdlet          Get-TtmCdTopology                Tridion.TopologyManager.Automation
Cmdlet          Get-TtmCdTopologyType            Tridion.TopologyManager.Automation
Cmdlet          Get-TtmCmEnvironment             Tridion.TopologyManager.Automation
Cmdlet          Get-TtmMapping                   Tridion.TopologyManager.Automation
Cmdlet          Get-TtmWebApplication            Tridion.TopologyManager.Automation
Cmdlet          Get-TtmWebsite                   Tridion.TopologyManager.Automation
Cmdlet          Import-TtmCdStructure            Tridion.TopologyManager.Automation
Cmdlet          Remove-TtmCdEnvironment          Tridion.TopologyManager.Automation
Cmdlet          Remove-TtmCdTopology             Tridion.TopologyManager.Automation
Cmdlet          Remove-TtmCdTopologyType         Tridion.TopologyManager.Automation
Cmdlet          Remove-TtmCmEnvironment          Tridion.TopologyManager.Automation
Cmdlet          Remove-TtmMapping                Tridion.TopologyManager.Automation
Cmdlet          Remove-TtmSiteTypeKey            Tridion.TopologyManager.Automation
Cmdlet          Remove-TtmWebApplication         Tridion.TopologyManager.Automation
Cmdlet          Remove-TtmWebsite                Tridion.TopologyManager.Automation
Cmdlet          Set-TtmCdEnvironment             Tridion.TopologyManager.Automation
Cmdlet          Set-TtmCdTopology                Tridion.TopologyManager.Automation
Cmdlet          Set-TtmCdTopologyType            Tridion.TopologyManager.Automation
Cmdlet          Set-TtmCmEnvironment             Tridion.TopologyManager.Automation
Cmdlet          Set-TtmMapping                   Tridion.TopologyManager.Automation
Cmdlet          Set-TtmWebApplication            Tridion.TopologyManager.Automation
Cmdlet          Set-TtmWebsite                   Tridion.TopologyManager.Automation
Cmdlet          Sync-TtmCdEnvironment            Tridion.TopologyManager.Automation

I'm sure you can see immediately that this gives you a great overview of the possibilities - probably including some things you hadn't thought of. You can also see how they follow the standard naming conventions. But now that you know what commands are available, how do you use them? What parameters do they accept? What are they for? 

It might sound obvious, but indeed, the modules come with batteries included, including built-in help. So, for example, to learn more about a command, you can simply do this: 

help Get-TtmMapping

or if your Unix roots are showing, this does the same thing:

man Get-TtmMapping

The output looks like this: 

NAME
    Get-TtmMapping

SYNOPSIS
    Gets one or all Mappings from the Topology Manager.

SYNTAX
    Get-TtmMapping [[-Id] <String>] [-TtmServiceUrl <String>] [<CommonParameters>]

DESCRIPTION
    The Get-TtmMapping cmdlet retrieves a Mapping with the specified Id.
    If Id parameter is not specified, list of all Mappings will be returned.

RELATED LINKS
    Add-TtmMapping
    Set-TtmMapping
    Remove-TtmMapping

REMARKS
    To see the examples, type: "get-help Get-TtmMapping -examples".
    For more information, type: "get-help Get-TtmMapping -detailed".
    For technical information, type: "get-help Get-TtmMapping -full".
    For online help, type: "get-help Get-TtmMapping -online"

By using these few simple tools, you can accelerate your learning process and find the relevant commands easily and quickly. Happy hunting! 

Powershell 5 for tired old eyes

Posted by Dominic Cronin at Jan 02, 2016 04:55 PM |

With the release of PowerShell 5, Microsoft introduced syntax highlighting. This is, in general, a nice improvement, but I wasn't totally happy with it, so I had to find out how to customise it. My problems were probably self-inflicted to some extent, as I think at some point I had tweaked the console colour settings. The PowerShell is hosted in a standard Windows console, and the colours it uses are in fact the 16 colours available from the console.

The console colours start out by default as fairly basic RGB combinations. You can see these if you open up the console properties (a right-click on the title bar of a console window will get you there). In PowerShell, these are given names - PowerShell has its own enum for these, which maps pretty directly onto the ConsoleColor enumeration of the .NET framework.

ConsoleColor  Description                                  Red  Green  Blue
Black         The color black.                               0      0     0
Blue          The color blue.                                0      0   255
Cyan          The color cyan (blue-green).                   0    255   255
DarkBlue      The color dark blue.                           0      0   128
DarkCyan      The color dark cyan (dark blue-green).         0    128   128
DarkGray      The color dark gray.                         128    128   128
DarkGreen     The color dark green.                          0    128     0
DarkMagenta   The color dark magenta (dark purplish-red).  128      0   128
DarkRed       The color dark red.                          128      0     0
DarkYellow    The color dark yellow (ochre).               128    128     0
Gray          The color gray.                              192    192   192
Green         The color green.                               0    255     0
Magenta       The color magenta (purplish-red).            255      0   255
Red           The color red.                               255      0     0
White         The color white.                             255    255   255
Yellow        The color yellow.                            255    255     0

In the properties dialog of the console these are displayed as a row of squares like this: 

and you can click on each colour and adjust the red-green-blue values. In addition to the "Properties" dialog, there is also an identical "Defaults" dialog, also available via a right-click on the title bar. Saving your tweaks in the Defaults dialog affects all future consoles, not only powershell consoles. 

In the Powershell, you can specify these colours by name. For example, the fourth one from the left is called DarkCyan. This is where it gets really weird. Even if you have changed the console colour to something else, it's still called DarkCyan. In the following screenshot, I have changed the fourth console colour to have the values for Magenta. 

Also of interest here is that the default syntax highlighting colour for a String, is DarkCyan, and of course, we also get Magenta in the syntax-highlighted Write-Host command. 

Actually - this is where I first had trouble. The next screenshot shows the situation after setting the colours back to the original defaults. You can also see that I am trying to change directory, and that the name of the directory is a String. 

My initial problem was that I had adjusted the Blue console color to have some green in it. This meant that a simple command such as CD left me with unreadable text with DarkCyan over a slightly green Blue background. This gave a particularly strange behaviour, because the tab-completion wraps the directory in quotes (making it a String token) when needed, and not otherwise. This means that as you tab through the directories, the directory name flips from DarkCyan to White and back again, depending on whether there's a space in it. Too weird...
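If you want to see at a glance what each named colour currently looks like in your console, this quick one-liner paints each ConsoleColor name in its own colour:

# Print each ConsoleColor name in that colour, revealing what your console currently maps it to
[enum]::GetValues([ConsoleColor]) | % { Write-Host $_ -ForegroundColor $_ }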

But all is not lost - you also have control over the syntax highlighting colours. You can start with listing the current values using: 

Get-PSReadlineOption

And then set the colours for the various token types using Set-PSReadlineOption. I now have the following line in my profile:

Set-PSReadlineOption -TokenKind String -ForegroundColor White

(If you use the default profile for this, you will be fine, but if you use one of the AllHosts profiles, then you need to check that your current host is a ConsoleHost.) 
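For the AllHosts case, here's a minimal sketch of the kind of guard you could use:

# Only touch PSReadline options when running in the standard console host
if ($host.Name -eq 'ConsoleHost') {
    Set-PSReadlineOption -TokenKind String -ForegroundColor White
}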

Anyway - lessons learned... Be careful when tweaking the console colours - this was far less risky before syntax highlighting... and you can also fix the syntax highlighting colours if you need to, but you can only choose from the current console colours. 

New Tridion cookbook article: Recursive walk of Tridion tree

Posted by Dominic Cronin at Nov 20, 2015 01:10 PM |

I'm still trying to get the important parts of my Tridion developer summit talk online. With a code-based demo like that, sharing the slides is pretty pointless, so I'm putting the code online wherever it makes sense. So far this has been in the Tridion cookbook. Here's the latest:

https://github.com/TridionPractice/tridion-practice/wiki/Recursive-walk-of-Tridion-tree

The thing that really triggered me to get this online was that someone had recently asked me whether it was possible to query Tridion to find items that were local to a publication rather than shared from higher in the BluePrint. With the tree walk in place, this becomes almost trivial. (I'm not saying that there aren't better ways to get the list of items to process, but the tree walk certainly works.)

So having got the items into a variable following the technique in the recipe, finding the shared items becomes as simple as:

$items | ? {$_.BluePrintInfo.IsShared}
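Or, since the original question was about local items, just invert the filter:

# Items local to the publication, i.e. not shared from higher in the BluePrint
$items | ? { -not $_.BluePrintInfo.IsShared }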

But it might be more productive to throw all the items into a spreadsheet along with the relevant parts of their BluePrint Info:

$items | select Title, Id, @{n="IsShared";e={$_.BluePrintInfo.IsShared}}, `
@{n="IsLocalized";e={$_.BluePrintInfo.IsLocalized}} `
| Export-csv blueprintInfo.csv

Am I the only one that finds this fun? It's fun, right! :-)

New Tridion Cookbook article: Set up publication targets

Posted by Dominic Cronin at Nov 11, 2015 12:06 AM |

In my "Talking to Tridion" session at the Tridion Developer Summit this year, one of the things I demonstrated was a script to automatically set up publication targets in Tridion. I'm now finally getting round to putting the talk materials on-line, and this one seemed a good candidate to become a recipe in the Tridion Cookbook. So if you are feeling curious, get yourself over to Tridion Practice and have a look. The new recipe is to be found here.

Managing the Tridion Core service powershell module as a git submodule

Posted by Dominic Cronin at Sep 06, 2015 01:50 PM |

N.B. Peter has changed the structure of the module (as he has every right to do, and I'm not complaining) - what this means is that this blog post is pretty useless other than as an exercise in poking at things. Maybe I'll figure it out, but in the meantime, assume that this technique won't work. 

I spend quite some time fiddling with various powershell implementations on my Tridion image. Whenever there's a place where I do experimental things like this, I run the risk that I'm going to break something, so, at the very least, I usually do a quick "git init" in the directory, add the files and commit them. Then I have the benefit of version diffs and rollbacks if I need them. The next step comes when I realise that it's something I'm going to work on over a longer time, and that I really would prefer not to lose. At this point, I usually go on to my Linux server and init a bare git repository, and then push from wherever I'm working.

Today I reached this second phase with the WindowsPowerShell directory of the Administrator account on my Tridion image. (It's about time, because I'm busy preparing a talk for the Tridion Developer Summit in a couple of weeks, and well, losing my scripts would put a kink in my plans, to say the least.)

In any case, I'd realised that I was running quite an old version of Peter Kjaer's Tridion-CoreService module. This module is the basis of pretty much any effort to use the Tridion core service from the powershell, and as this is the subject of my upcoming talk, I figured I should at least be doing my demos on the current version.

If you go to the github page for the module, you'll see that Peter's provided installation scripts which will help you to get up and running, but of course, if you have git installed, it makes just as much sense to clone the module directly. The only problem I had was that Modules are normally located in the Modules directory under the WindowsPowerShell directory. (You can add other locations to env:PSModulePath, but for what I wanted, that wasn't ideal.)

Fortunately, git is widely used for projects that make use of other projects, and there is very good support built in, by way of git-submodule. As my main git repository for the powershell stuff is directly in the WindowsPowerShell directory, all I needed to do was add Peter's module as a submodule with the right path.

In fact I just clicked on the menu option in Tortoise Git, but the basic command looks something like this:

git submodule add --name Modules/Tridion-CoreService git@github.com:pkjaer/tridion-powershell-modules.git Modules\Tridion-CoreService

With this in place, git understands that the Tridion-CoreService code belongs to Peter's module, and if he releases a new version, I can just pull. And of course, my own changes go in my own repository. Adding a submodule adds a .gitmodules file in your repository, so if I ever clone my WindowsPowerShell repository into another server, the location of Peter's repository can be retrieved, and the files pulled from there.
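So on that other server, getting everything back would be the standard submodule two-step (real git commands; the repository location placeholder is of course whatever yours happens to be):

# Clone your own repository, then fetch the submodule contents recorded in .gitmodules
git clone <your-git-server>:WindowsPowerShell.git
git submodule update --init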

One word of warning. This is not the official release process for the Tridion-CoreService module. That is described here. As the module is pretty much a one-man affair, it's not unreasonable that there's only the master branch, so pulling from it is at your own risk. Personally I'm happy with the small risk, as it helps me to keep my development system a bit tidier - and heck - if it breaks, we'll fix it!


Getting the complete component XML

One of the basic operations that a Tridion developer needs to be able to do is getting the full XML of a Component. Sometimes you only need the content, but say, for example, that you're writing an XSLT that transforms the full Component document - you need to be able to get an accurate representation of the underlying storage format. (OK - for now let's just skate over the fact that different versions have different XML formats under the water.)

In the balmy days of early R5 versions, this was pretty easy to do. The Tridion installation included a "protocol handler", which meant that if you just pasted a TCM URI into the address bar of your browser, you'd get the XML of that item displayed in the browser. This functionality was actually present so that you could reference Tridion items via the document() function in an XSLT, but this was a pretty useful side effect. OK... you had to be on the server itself, but hey - that's not usually so hard for a developer. If you couldn't get on the server, or you found it less convenient, another option was to configure the GUI to be in debug mode, and you'd get an extra button that opened up some "secret" dialogs that gave you access to, among other things, the XML of the item you had open in the GUI.

Moving on a little to the present day, things are a bit different. Tridion versions since 2011 have a completely different GUI, and XSLT transforms are usually done via the .NET framework, which has other ways of supporting access to "arbitrary" URIs in your XSLT. The GUI itself is built on a framework of supported APIs, but doesn't have a secret "debug" setting. However, this isn't a problem, because all modern browsers come fully loaded with pretty powerful debugging tools.

So how do we go about getting the XML if we're running an up-to-date version of Tridion? This question cropped up just a couple of days ago on my current project, where there's an upgrade from Tridion 2009 to 2013 going on. I didn't really have a simple answer - so here's how the complicated answer goes:

My first option when "talking to Tridion" is usually the core service. The TOM.NET API will give you the XML of an item directly via the .ToXml() methods. Unfortunately, someone chose not to surface this in the core service API. Don't ask me why. Anyway - for this kind of development work, you could use the TOM.NET. You're not really supposed to use the TOM.NET for code that isn't hosted by Tridion (such as templates), but on your development server, what the eye doesn't see, the heart won't grieve over. Of course, in production code, you should take SDL's advice on such things rather more seriously. But we're not reduced to that just yet.

Firstly, a brief stop along the way to explain how we solved the problem in the short term. Simply enough - we just fired up a powershell and used it to access the good-old-fashioned TOM.COM. Like this:

PS C:\> $tdse = new-object -com TDS.TDSE
PS C:\> $tdse.GetObject("tcm:2115-5977",1).GetXml(1919)

Simple enough, and it gets the job done... but did I mention? We have the legacy pack installed, and I don't suppose this would work unless you have.
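If you want to hang on to the output - for feeding to an XSLT, say - a trivial variation on the same sketch drops it into a file:

# Same TDSE object and example URI as above; just capture the XML to a file
$tdse.GetObject("tcm:2115-5977", 1).GetXml(1919) | Out-File component.xml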

So can it be done at all with the core service? Actually, it can, but you have to piece the various parts together yourself. I did this once, a long time ago, and if you're interested, you can check out my ComponentFactory class over on a long-lost branch of the Tridion power tools project. But that's probably too much fuss for day-to-day work. Maybe there are interesting possibilities for a powershell module to make it easier, but again.... not today.

But thinking about these things triggered me to remember the Power tools project. One of the power tools integrates an extra tab into your item popup, giving you the raw XML view. I'd been thinking to myself that the GUI API (Anguilla) probably had reasonably easy support for what we're trying to do, but I didn't want to go to the effort of figuring it all out. Never fear: after a quick poke around in the sources I found a file called ItemXmlTab.ascx.js, and there I found the following gem:

var xmlSource = $display.getItem().getXml();

That's all you need. The thing is... the power tool is great. It does what it says on the box, and as far as I'm concerned, it's an exceedingly good idea to install it on your development server. But still, there are reasons why you might not. Your server might be managed by someone else, and they might not be so keen, or you might be doing some GUI extension development yourself and want to keep a clear field of view without other people's extensions cluttering up the system. Whatever - sometimes it won't be there, and you'd still like to be able to just suck that goodness out of Tridion.

Fortunately - it's not a problem. Remember when I said most modern browsers have good development tools? We use them all the time, right? F12 in pretty much any browser will get you there - then you need to be able to find the console. Some browsers will take you straight there with Ctrl+Shift+J. So you just open the relevant Tridion item, go to the console and grab the XML. Here's a screenshot from my dev image.

[Screenshot: browser console showing the Tridion GUI]

So now you can get the XML of an item on pretty much any modern Tridion system without installing a thing. Cool, eh? Now some of you at the back are throwing things and muttering something about shouldn't it be a bookmarklet? Yes it should. That's somewhere on my list, unless you beat me to it.