
Dominic Cronin's weblog

Why should your Tridion GUI extension 'model' have its own service layer on top of the core service?

Posted by Dominic Cronin at Aug 08, 2012 08:49 PM |

I've spent some time lately looking at the architecture for the next phase of implementing the Component Synchroniser for the Tridion Power Tools project. This meant looking through most of the other power tools, because, of course, they are a great resource for anyone wanting to build a Tridion GUI extension. The downside of this is that sometimes, reading the code, you can observe a pattern being used, but it can be hard to tell whether it represents good or bad design. I'd noticed that the model of pretty much every power tool is implemented as a WCF service, often acting as a very thin wrapper around the core service client. As I was wondering about this, I posed the following question in the private chat channel used by the Tridion MVPs and community builders:

So if you're doing a gui extension, is it reckoned to be bad form to access the core service directly from your aspx. Or is it just coincidence that most (all?) of the power tools have an additional service layer?

This was enough to spark quite an informative debate, and in keeping with the spirit of the thing, I promised to write it up for general consumption. The contributors were Frank van Puffelen, Nuno Linhares, Peter Kjaer and Jeremy Grand-Scrutton.

The general feeling was that you ought to stick to the pattern I had observed in the power tools. The reasons were as follows:

  • Ease of coding - The Anguilla framework can automatically generate a JavaScript proxy for your service.
  • Maintainability - if you talk directly from JavaScript to the core service, you will not get any compile-time checks, whereas your own service layer would be built in .NET and would therefore have some defences against future (likely) changes in the core service client.
  • Consistency with the rest of the CME - In the CME, views are typically considered fully client-side. Where the CME does use Aspx, this is only to generate some HTML on the server, and typically not for implementation logic.
  • Known issues - ASP.NET postbacks in Anguilla views have been known to cause problems for some people, since e.g. popups won't keep their state through a postback (or an F5 press, for that matter).

 

According to these criteria, the design I was actually looking at could have used the core service directly, as my idea was to generate some HTML. In practice, it turns out that there are other reasons to stick with the extra service layer. Even so, I'm very glad I asked the question, and that the answers I got were so informative. Thanks guys!

A thing of beauty is a joy for ever

Posted by Dominic Cronin at Jul 09, 2012 11:30 PM |

So - I've been using the Windows Powershell for the odd bit of Tridion work. You knew that.

And you probably also knew that very often the Tridion API hands you back a string representing an XML document, and that it's very convenient to "cast" this to a .NET XmlDocument using the [xml] operator. Just search this blog for "powershell" and you'll find enough examples. But still - there's a missing piece in the puzzle. So today I wanted to look at the output from the .GetTridionWebSchemaXML() method on a Tridion Object Model Schema object. (Don't worry - I am weaning myself off the TOM; I wanted to compare this API with the ReadSchemaFields() method on the core service client API.)

Anyway - for what it's worth, here's what the raw string looks like:

> $tdse.GetObject("tcm:21-509-8",1).GetTridionWebSchemaXML(1919,$true)
<tcm:TridionWebSchema ID="tcm:21-509-8" IsEditable="false" xmlns:tcm="http://www.tridion.com/ContentManager/5.0"><tcm:C
ontext xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns:transform-e
xt="urn:tridion:transform-ext"><tcm:Publication xlink:type="simple" xlink:title="Synchroniser tests" xlink:href="tcm:0-
21-1" /><tcm:OrganizationalItem xlink:type="simple" xlink:title="TestSchemaOne" xlink:href="tcm:21-50-2" /></tcm:Contex
t><tcm:Info xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns:transf
orm-ext="urn:tridion:transform-ext"><tcm:LocationInfo><tcm:WebDAVURL>/webdav/Synchroniser%20tests/Building%20Blocks/Tes
tSchemaOne/TestSchemaCategories.xsd</tcm:WebDAVURL><tcm:Path>\Synchroniser tests\Building Blocks\TestSchemaOne</tcm:Pat
h></tcm:LocationInfo><tcm:BluePrintInfo><tcm:OwningPublication xlink:type="simple" xlink:title="Synchroniser tests" xli
nk:href="tcm:0-21-1" /><tcm:IsShared>false</tcm:IsShared><tcm:IsLocalized>false</tcm:IsLocalized></tcm:BluePrintInfo><t
cm:VersionInfo><tcm:Version>3</tcm:Version><tcm:Revision>0</tcm:Revision><tcm:CreationDate>2012-07-07T18:28:23</tcm:Cre
ationDate><tcm:RevisionDate>2012-07-09T20:18:21</tcm:RevisionDate><tcm:Creator xlink:type="simple" xlink:title="TRIDION
DEV\Administrator" xlink:href="tcm:0-11-65552" /><tcm:Revisor xlink:type="simple" xlink:title="TRIDIONDEV\Administrator
" xlink:href="tcm:0-11-65552" /><tcm:ItemLock Title="No lock" Type="0" /><tcm:IsNew>false</tcm:IsNew></tcm:VersionInfo>
<tcm:AllowedActions><tcm:Actions Allow="1173513" Deny="102" Managed="0" /></tcm:AllowedActions></tcm:Info><tcm:Data xml
ns:xlink="http://www.w3.org/1999/xlink" xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns:transform-ext="urn:tr
idion:transform-ext"><tcm:Title>TestSchemaCategories</tcm:Title><tcm:Description>TestSchemaCategories</tcm:Description>
<tcm:Purpose>Component</tcm:Purpose><tcm:NamespaceURI>uuid:f14d60ed-0f7c-4d1f-a2e3-97d1dfeb1a1f</tcm:NamespaceURI><tcm:
RootElementName>Content</tcm:RootElementName><tcm:Fields><tcm:KeywordField><tcm:Name>ColoursOne</tcm:Name><tcm:Descript
ion>ColoursOne</tcm:Description><tcm:MinOccurs>1</tcm:MinOccurs><tcm:MaxOccurs>unbounded</tcm:MaxOccurs><tcm:Category x
link:type="simple" xlink:title="Colours" xlink:href="tcm:21-59-512" /><tcm:Size>1</tcm:Size><tcm:List Type="tree" /><tc
m:ExtensionXml xmlns="http://www.sdltridion.com/ContentManager/R6" /></tcm:KeywordField><tcm:SingleLineTextField><tcm:N
ame>Animals</tcm:Name><tcm:Description>Test field with locally declared list</tcm:Description><tcm:MinOccurs>1</tcm:Min
Occurs><tcm:MaxOccurs>1</tcm:MaxOccurs><tcm:Size>1</tcm:Size><tcm:List Type="select"><tcm:Entry>Horse</tcm:Entry><tcm:E
ntry>Haddock</tcm:Entry><tcm:Entry>Weasel</tcm:Entry></tcm:List><tcm:ExtensionXml xmlns="http://www.sdltridion.com/Cont
entManager/R6" /></tcm:SingleLineTextField></tcm:Fields><tcm:MetadataFields /><tcm:AllowedMultimediaTypes /><tcm:Compon
entProcess xlink:type="simple" xlink:title="" xlink:href="tcm:0-0-0" /></tcm:Data></tcm:TridionWebSchema>

Yeah - erm ... Okaayyyy. Great.

OK - so how about we do the cast?

 

> [xml]$tdse.GetObject("tcm:21-509-8",1).GetTridionWebSchemaXML(1919,$true)
TridionWebSchema
----------------
TridionWebSchema

 

Well - at least you can read it... but seriously - it's also not super helpful if you just want to scan the XML with good old-fashioned human eyeballs.

So what can we do? Well I got to the point where I actually typed the following into Google:

powershell pretty print xml

and the first hit was on Keith Hill's blog. Keith had written a nice little function that looks like this:

function XmlPrettyPrint([string]$xml) {
    $tr = new-object System.IO.StringReader($xml)
    $settings = new-object System.Xml.XmlReaderSettings
    $settings.CloseInput = $true
    $settings.IgnoreWhitespace = $true
    $reader = [System.Xml.XmlReader]::Create($tr, $settings)
    $sw = new-object System.IO.StringWriter
    $settings = new-object System.Xml.XmlWriterSettings
    $settings.CloseOutput = $true
    $settings.Indent = $true
    $writer = [System.Xml.XmlWriter]::Create($sw, $settings)
    
    while (!$reader.EOF) {
        $writer.WriteNode($reader, $false)
    }
    $writer.Flush()
    
    $result = $sw.ToString()
    $reader.Close()
    $writer.Close()
    $result
}

A minute later, this function was in my Powershell profile (I slightly altered the name and added an alias), so now I can do the following:

> ppx ([xml]$tdse.GetObject("tcm:21-509-8",1).GetTridionWebSchemaXML(1919,$true)).OuterXml
<?xml version="1.0" encoding="utf-16"?>
<tcm:TridionWebSchema ID="tcm:21-509-8" IsEditable="false" xmlns:tcm="http://www.tridion.com/ContentManager/5.0">
  <tcm:Context xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns:transform-ext="urn:tridion:transform-ext">
    <tcm:Publication xlink:type="simple" xlink:title="Synchroniser tests" xlink:href="tcm:0-21-1" />
    <tcm:OrganizationalItem xlink:type="simple" xlink:title="TestSchemaOne" xlink:href="tcm:21-50-2" />
  </tcm:Context>
  <tcm:Info xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns:transform-ext="urn:tridion:transform-ext">
    <tcm:LocationInfo>
      <tcm:WebDAVURL>/webdav/Synchroniser%20tests/Building%20Blocks/TestSchemaOne/TestSchemaCategories.xsd</tcm:WebDAVURL>
      <tcm:Path>\Synchroniser tests\Building Blocks\TestSchemaOne</tcm:Path>
    </tcm:LocationInfo>
    <tcm:BluePrintInfo>
      <tcm:OwningPublication xlink:type="simple" xlink:title="Synchroniser tests" xlink:href="tcm:0-21-1" />
      <tcm:IsShared>false</tcm:IsShared>
      <tcm:IsLocalized>false</tcm:IsLocalized>
    </tcm:BluePrintInfo>
    <tcm:VersionInfo>
      <tcm:Version>3</tcm:Version>
      <tcm:Revision>0</tcm:Revision>
      <tcm:CreationDate>2012-07-07T18:28:23</tcm:CreationDate>
      <tcm:RevisionDate>2012-07-09T20:18:21</tcm:RevisionDate>
      <tcm:Creator xlink:type="simple" xlink:title="TRIDIONDEV\Administrator" xlink:href="tcm:0-11-65552" />
      <tcm:Revisor xlink:type="simple" xlink:title="TRIDIONDEV\Administrator" xlink:href="tcm:0-11-65552" />
      <tcm:ItemLock Title="No lock" Type="0" />
      <tcm:IsNew>false</tcm:IsNew>
    </tcm:VersionInfo>
    <tcm:AllowedActions>
      <tcm:Actions Allow="1173513" Deny="102" Managed="0" />
    </tcm:AllowedActions>
  </tcm:Info>
  <tcm:Data xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns:transform-ext="urn:tridion:transform-ext">
    <tcm:Title>TestSchemaCategories</tcm:Title>
    <tcm:Description>TestSchemaCategories</tcm:Description>
    <tcm:Purpose>Component</tcm:Purpose>
    <tcm:NamespaceURI>uuid:f14d60ed-0f7c-4d1f-a2e3-97d1dfeb1a1f</tcm:NamespaceURI>
    <tcm:RootElementName>Content</tcm:RootElementName>
    <tcm:Fields>
      <tcm:KeywordField>
        <tcm:Name>ColoursOne</tcm:Name>
        <tcm:Description>ColoursOne</tcm:Description>
        <tcm:MinOccurs>1</tcm:MinOccurs>
        <tcm:MaxOccurs>unbounded</tcm:MaxOccurs>
        <tcm:Category xlink:type="simple" xlink:title="Colours" xlink:href="tcm:21-59-512" />
        <tcm:Size>1</tcm:Size>
        <tcm:List Type="tree" />
        <tcm:ExtensionXml xmlns="http://www.sdltridion.com/ContentManager/R6" />
      </tcm:KeywordField>
      <tcm:SingleLineTextField>
        <tcm:Name>Animals</tcm:Name>
        <tcm:Description>Test field with locally declared list</tcm:Description>
        <tcm:MinOccurs>1</tcm:MinOccurs>
        <tcm:MaxOccurs>1</tcm:MaxOccurs>
        <tcm:Size>1</tcm:Size>
        <tcm:List Type="select">
          <tcm:Entry>Horse</tcm:Entry>
          <tcm:Entry>Haddock</tcm:Entry>
          <tcm:Entry>Weasel</tcm:Entry>
        </tcm:List>
        <tcm:ExtensionXml xmlns="http://www.sdltridion.com/ContentManager/R6" />
      </tcm:SingleLineTextField>
    </tcm:Fields>
    <tcm:MetadataFields />
    <tcm:AllowedMultimediaTypes />
    <tcm:ComponentProcess xlink:type="simple" xlink:title="" xlink:href="tcm:0-0-0" />
  </tcm:Data>
</tcm:TridionWebSchema>
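
For anyone wanting to set up the same thing, the profile entries might look something like this (a minimal sketch - the script location, function name and alias are just illustrative choices):

# In your Powershell $PROFILE: load Keith's function (or paste its body in directly)
# and give it a conveniently short alias
. C:\scripts\XmlPrettyPrint.ps1
Set-Alias ppx XmlPrettyPrint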

So what's my point? Well I have a couple:

  1. The Internet is great (by which I mean, the people of the Internet are great). I could have written that function myself in about half an hour. But in practice I might not have had the energy to do so at 10pm on a working Monday. Thanks to Keith's willingness to share, I had my solution inside a minute - working and tested. How cool is that?
  2. Somehow, this has just taken a little bit of the friction out of my working day, not to mention my so-called free time. I can now pull data straight out of a method that returns string, and get human-readable XML. This stuff makes a difference.

 

Thanks Keith - and all the other Keiths out there.

P.S. Maybe even nicely formatted XML will never be a thing of beauty, so apologies to Keats.

Why is it really slow to access Tridion via webdav?

Posted by Dominic Cronin at Jun 01, 2012 09:02 PM |

Today I wanted to upload 20 or so image files to my Tridion server. This is a bit of a faff to do through the normal user interface. (You'd have to create multimedia components one by one and then upload the binaries individually.) But no problem, because you can always use WebDAV, right? I wanted to upload the images from the server, which runs Windows Server 2008 R2. OK - so where are we now? Erm... Computer.... right-click... Map network drive.... Pick a letter.... http://localhost/webdav/ .... OK! Boom... there we are - a nicely mapped WebDAV drive.
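
For the record, you can script the same mapping from a command line, assuming the WebClient service is running (the drive letter here is just an example):

net use T: http://localhost/webdav/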

But.... it was awful. Like wading knee-deep through treacle with all the acrobats of the Chinese state circus balanced on your head. Slow? I could have made a cup of tea while it opened a folder.

So what was going on? My first instinct was that it probably wasn't Tridion to blame. Something like this, that more or less renders the feature unusable, would have been flushed out during product testing, and fixed. So let's start by blaming Windows! (Millions of Apple fan-persons and Linux-inhaling Bill-haters can't all be wrong, eh?) Oh, enough of that. Suffice it to say that a quick Google took me to Mark Lognoul's blog, where he describes the solution to this problem on Vista or Seven. Does it work on Server 2008? Yup - works like a charm. Thanks, Mark. Job's a good'un.

Getting to grips with the Tridion core service in Powershell

Posted by Dominic Cronin at Apr 01, 2012 07:05 PM |

As regular readers of this blog will know, I've been a long-standing fan of the Windows Powershell as a tool for interacting with Tridion. On more than one project, the flexibility of the Powershell has allowed me to process Tridion data in ad-hoc ways that would be unthinkable if you had to bring with you all the overhead of, say, C# and Visual Studio. All of that is, of course, positive, but the downside of it has been that I don't seem to be making the jump over to the core service, which, after all, I should expect to be one of my primary APIs for some time to come. So time to make a change.

A while ago, I had tinkered with using the TOM.NET API from the Powershell, but I stopped putting effort into that once I got the basics working. The advice from SDL is clearly to use the core service for the kind of scenarios that the Powershell covers. Just for the record, though - getting a TOM.NET session in the Powershell is considerably more difficult than the equivalent activity using the core service. To be fair, neither technique even remotely approaches the simplicity of "$tdse = new-object -com TDS.TDSE", but like I said, it's time to move on.

So once I started looking at this, I had a quick look at Frank van Puffelen's GetCoreServiceClientWithoutConfigFile recipe in the Tridion Cookbook, and then I spent some time snuffling around in Peter Kjaer's Tridion Powershell Modules. Both of these are great resources, but I suffer quite badly from Not Invented Here syndrome, so at the very least, I had to poke around a bit and see what's going on. After some blatant stealing (mostly from Peter's code), I ended up with this:

Add-Type -assemblyName System.ServiceModel
# Create the binding, and open up the WCF quotas, which default to fairly low values
$binding = new-object System.ServiceModel.WSHttpBinding
$binding.MaxBufferPoolSize = [int]::MaxValue
$binding.MaxReceivedMessageSize = [int]::MaxValue
$binding.ReaderQuotas.MaxArrayLength = [int]::MaxValue
$binding.ReaderQuotas.MaxBytesPerRead = [int]::MaxValue
$binding.ReaderQuotas.MaxNameTableCharCount = [int]::MaxValue
$binding.ReaderQuotas.MaxStringContentLength = [int]::MaxValue
# Point an endpoint at the wsHttp flavour of the 2011 core service
$endpoint = new-object System.ServiceModel.EndpointAddress http://localhost/webservices/CoreService2011.svc/wsHttp
# Load the client assembly that ships with Tridion, and construct the client
Add-Type -Path 'C:\Program Files (x86)\Tridion\bin\client\Tridion.ContentManager.CoreService.Client.dll'
$core = new-object Tridion.ContentManager.CoreService.Client.SessionAwareCoreServiceClient $binding,$endpoint

So what's going on here? What I've extracted is pretty close to the barest minimum implementation I could get to. Maybe you could get it smaller if you weren't bothered by running up against the fairly low default quota values offered by the Windows Communication Foundation. In fact, I'm quite unsure about my approach to the quotas. What I've done is effectively to say that quotas aren't helpful for my scenario, and set them all to the maximum possible. Does this make sense? Let me know what you think. (Edit: I asked a question about this on stackoverflow, and got some good answers.)

So - to use the service you need three things:

  1. The core service client assembly. (It's great that SDL are now shipping this with the product. This means I can mail you a script, and say "use the 'official' client", and expect it to work.)
  2. A System.ServiceModel.WsHttpBinding object
  3. A System.ServiceModel.EndpointAddress object

So we load the System.ServiceModel assembly using Powershell's Add-Type cmdlet. This assembly is part of the .NET framework, so we can just ask for it by name. Later in the script we use the same cmdlet to load the client dll, but then we have to specify its location. Once we have System.ServiceModel loaded, we can instantiate a binding and an endpoint, and pass those to the constructor of the client. Even though we end up with a few lines of code, it's not really hard, eh?

From here on we can just use the $core object to talk to the service. To be honest, having had a bit of a dig into how it works, you're probably better off just using Peter's module, which takes care of more than my hard-coded version does, and also offers some utility methods, for example, to create a new user. In fact, assuming you have installed the module, getting started is even easier than instantiating a TDSE: just "$core = Get-TridionCoreServiceClient". Nice job, Peter. Thanks.
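
Just to prove to yourself that the connection is alive, a couple of quick calls will do. (A sketch: the TCM URI is only an example from my system, and Read() is happy with a null ReadOptions.)

> $core.GetCurrentUser().Title
> $core.Read("tcm:0-2-1", $null).Title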

Edit: If you prefer a NetTcp binding, that's pretty simple too: just instantiate the correct binding type:

$binding = new-object System.ServiceModel.NetTcpBinding

and use a different endpoint:

$endpoint = new-object System.ServiceModel.EndpointAddress net.tcp://localhost:2660/CoreService/2011/netTcp

What should we hope for Tridion workflow?

Posted by Dominic Cronin at Mar 18, 2012 07:45 PM |

Over at Tridion Developer, Chris Summers posts today about the forthcoming Tridion Bundles feature, and how it may be bringing a revival to the fortunes of Tridion workflow. He says: "Now given the sparkle I have seen in people’s eyes whenever they say 'Bundles…', I am pretty confident that something great is coming in the next major release, and I thought it might be time resurrect the workflow debate in the community – if we get vocal about these things now, we may see some of our dreams sneaking in with the bundles."

Picking up on this thought, I've set out to attempt to articulate what my concerns with the existing implementation have been and what I hope for in any new approach. I don't know very much about the Bundles feature - it's all pretty much still under wraps over at Tridion HQ, so I'm not really going to cast my criticisms and suggestions in terms of Bundles. Maybe the Bundles team have already considered all these concerns, and their design covers the same ground in an entirely different way. I'm just going to put it in terms of how things are now, and you can extrapolate.

So what's wrong with workflow as it now stands? Well, you'll get different answers depending on who you talk to. If you ask different people what workflow features in a product like Tridion should offer, you'll generally get answers that fall roughly into one of these categories:

  1. Something intended to enforce governance requirements, where some level of process automation is a necessary evil
  2. Something which is intended primarily to support process automation, and within which enforcing governance rules is one of several possible applications

On more than one occasion, I've heard criticisms of Tridion from people whose prior experience had led them to expect the latter. My defence of Tridion would usually come out of the notion that Tridion's workflow support is in the first category, and was never intended to be the second. (OK - if pressed, I have to accept that the literal meaning of the word Workflow is far closer to the second than the first.) Now right there, you've already got a huge debate. If Chris asks: "Why can't Tridion notify me when I have something to review?", then perhaps he's suggesting that some process automation might be a good thing. Whatever... Chris doesn't need me to put words in his mouth, and I'm sure he'll make his position clear in his promised expansions on the subject. For myself, I don't have a real problem if Tridion doesn't support process automation "out of the box", as long as it's not too hard to wire it up if that's what you want. If SDL chose to offer workflow support as a third-party integration, I'd be fine with that, although I suspect that the amount of architectural change required might be significant. As a web content management system intended for enterprise customers, Tridion will always have a proportion of customers who need to implement governance rules, and for them, I suspect the built-in support would need to do at least as much as the current implementation does.

In terms of the existing approach, the biggest problem is probably that workflow controls access to versions by riding on the back of the check-in mechanism. The check-in mechanism in turn is designed to ensure that when you are working on something, other people can't see the change until you check it in. When an item "goes in to" workflow, the item is checked-out to the system workflow user, and remains checked out until it leaves workflow. Intermediate saves don't get a new version number, but a "revision", and these revisions are visible to the current workflow assignee if they use the workflow features to access the item. At first sight, this seems like a great plan, because the workflow sub-system doesn't have to do a thing to manage the visibility of the version of the item. If only it were so simple.

Well, almost: there's a second mechanism - approval statuses - whereby workflow controls which version of an item can be used for publishing. If this worked as you might intuitively imagine, it would be great. By using workflow, you can assign an approval status to an item. If you see that approval status, you know that the item has been through a specific workflow activity. Approval statuses are in an ordered list, and you can specify a Minimum Approval Status on a publication target. The fly in the ointment is that Tridion only respects these approval statuses while the item remains in workflow.

This combination of workflow items being checked-out, and an approval status mechanism that doesn't work outside workflow, adds up to some pretty tricky problems to solve if, for example, you want to have workflow on both pages and components. Especially for the end users, it can be particularly disturbing when the edit you have just made to a component disappears when you view the same thing from the page.

The basic idea behind approval statuses is a good one. Specifically, the idea that you can associate data with a given version of an item, and that this allows you to evaluate which version of an item ought to be used for a particular activity, or even more generally to drive any business rule. The key thing here is that the only way you can write that data is to ensure that the item goes through a specific workflow activity. (Apart from that last condition, doesn't this remind you of Application Data?) If this "workflow data" were to include some kind of visibility constraints, then maybe you could do away with the dependency on the check-out mechanism. If nothing else, removing that dependency would probably make it much more feasible to integrate with third-party tools.

OK - as I've said, I'm suggesting a radical re-architecture, and I'm quite sure that such things take far more consideration than it has taken me to knock out a quick blog article. As I understand it, the Bundles thinking is already fairly far developed, and, as I've said, maybe they've taken a completely different view of things. Still - I think Chris is right to suggest that now is the time for us to be vocal about what we want. What if the workflow experience were smooth enough that we could consider using it for process automation? As things stand, we mostly end up advising people that workflow is a measure only to be used in cases of necessity. There is so much scope for a positive change here, that I'm really looking forward to the next generation of Tridion workflow, whatever that ends up looking like.

MSSQL TCP Dynamic ports and Tridion Content Delivery

Posted by Dominic Cronin at Feb 26, 2012 06:20 PM |

Recently, I was setting up Tridion content delivery on my development server. This runs as a VMWare image on my laptop, while the MSSQL database runs without virtualisation on the same laptop. If you read this earlier post, you will know that I like to have a script to check all my settings in advance (the script uses the standard .NET data access classes). I had done this, and everything was fine, so I was mildly surprised to find that as soon as I tried to publish anything I got error messages about being unable to connect to the database. The relevant part of my storage configuration looked like this:

<Storage Id="brokerdb" Type="persistence" dialect="MSSQL" Class="com.tridion.storage.persistence.JPADAOFactory">
  <Pool Type="jdbc" Size="5" MonitorInterval="60" IdleTimeout="120" CheckoutTimeout="120" />
  <DataSource Class="com.microsoft.sqlserver.jdbc.SQLServerDataSource">
    <Property Name="serverName" Value="WSL117\DEVELOPER" />
    <Property Name="portNumber" Value="1433" />
    <Property Name="databaseName" Value="Tridion_Broker" />
    <Property Name="user" Value="TridionBrokerUser" />
    <Property Name="password" Value="topsecret" />
  </DataSource>
</Storage>

OK - so all those settings were just the same as in my test script, except that the test script didn't specify the port number, but that's the default port, so nothing to see there, eh?

Anyway - the message was clear: it was something to do with the connection, so I went to check that MSSQL was listening on the expected port with a quick "netstat -oan". Lo and behold, it was nowhere to be seen. Eventually I discovered that in the Sql Server Configuration Manager you can configure the port, and that there's a setting called "TCP Dynamic Ports", which was switched on.

At this point, I could simply have configured a static port number and moved on, but I was intrigued. If MSSQL wasn't listening on a static port, how did my test script succeed? OK - it didn't seem too unreasonable that whatever mechanism was in play should be understood by the .NET framework, but could I get it to work from a Java-based system? Well, after a bit of Googling, it turned out that there's a service called the SQL Server Browser, which lets the client know what port it needs to connect to. Not only that, but it seems the Microsoft JDBC driver, which I was using, also supports this mechanism. I commented out the Property element that specifies the port number, restarted IIS (this was an HTTP Upload site) and sure enough, when I tested it again, everything worked great.
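
For reference, the working DataSource section simply leaves the port out - these values are from my setup:

<DataSource Class="com.microsoft.sqlserver.jdbc.SQLServerDataSource">
  <Property Name="serverName" Value="WSL117\DEVELOPER" />
  <!-- no portNumber property: the driver asks the SQL Server Browser for the DEVELOPER instance's port -->
  <Property Name="databaseName" Value="Tridion_Broker" />
  <Property Name="user" Value="TridionBrokerUser" />
  <Property Name="password" Value="topsecret" />
</DataSource>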

All this: the sweet smell of success, and yet somehow I was still troubled. Why on earth would Microsoft have introduced this dynamic mechanism? After all, it just means more configuration. Extra stuff to tweak. Extra stuff to go wrong. So why? It turns out that the answer is pretty straightforward. This quote from the SQL Server Help explains it all:

Prior to SQL Server 2000, only one instance of SQL Server could be installed on a computer. SQL Server listened for incoming requests on port 1433, assigned to SQL Server by the official Internet Assigned Numbers Authority (IANA). Only one instance of SQL Server can use a port, so when SQL Server 2000 introduced support for multiple instances of SQL Server, SQL Server Resolution Protocol (SSRP) was developed to listen on UDP port 1434. This listener service responded to client requests with the names of the installed instances, and the ports or named pipes used by the instance. To resolve limitations of the SSRP system, SQL Server 2005 introduced the SQL Server Browser service as a replacement for SSRP.

I had installed a named instance of MSSQL alongside the existing SQLEXPRESS instance, so perhaps I should have figured this out myself. Whatever - at least it explains why things are set up this way. I chatted with a colleague from Indivirtual's hosting partner Sentia, and he confirmed that for Tridion infrastructure jobs, one of the tasks they have to do is configure MSSQL to listen on static ports. For a dedicated server, of course, this is the obvious choice.

Still - for the kind of configuration I have for development and research, it's great that the dynamic ports feature works well with Tridion. Of course, that's not the same thing as being a supported configuration. As with any enterprise software vendor, SDL Tridion only generally support configurations they have tested. In order to ensure that this gets "on to their radar", I've created an "idea" on ideas.sdltridion.com. (If you have a login there, you can go and vote it up if you like!) Hopefully in future releases, this will become a tested and supported configuration.

Anyway - there was a time when I did far more infrastructure work than I've done lately. I guess it shows!

JRE/JDK 6u29 doesn't work with Tridion content delivery and MSSQL

Posted by Dominic Cronin at Feb 20, 2012 10:05 PM |

I've just spent an excessive amount of time trying to get Content Delivery set up on my SDL Tridion 2011 SP1 image. I couldn't get the HttpUpload web page working, no matter how much I tweaked the various settings. Then I gave up on that and tried setting up a deployer service instead. Still no joy; it wouldn't start. Fortunately I was able to call on my friends in the Tridion community, and Nuno Linhares came up with the answer. It turns out that the specific update of Java I had installed somehow doesn't get along with some of the other pieces in the puzzle. According to Nuno, it had something to do with using MSSQL as well, but frankly, I was just pleased to find that when I uninstalled 6u29 and installed 6u30, everything worked fine. Thanks Nuno.

Batching components for the component synchronizer

Posted by Dominic Cronin at Feb 15, 2012 08:25 PM |

Although the Tridion power tools are currently undergoing a major overhaul to bring them in line with SDL Tridion 2011 and its Anguilla extensibility framework, many of us will be working on pre-2011 systems for some time to come. This being so, the "old" power tools remain a useful resource. Today I was working on a 2009 system, where we are busy with a data migration. This means using the component synchronizer. We've been using it for a few things, but today we wanted to remove some redundant fields from a schema, and then have that reflected in the data. The schema in question had about 8500 components based on it, and processing the entire list in one go was going to slow the whole team down. (We're doing this on a relatively under-powered image that can run quite slowly sometimes - it's easy to max it out!)

So what to do? The way of using the component synchronizer that I'm most familiar with is simply to select the schema and then process all its components. However, the synchronizer also has an option to process a specific list of components; you have to paste a comma-separated list of TCM URIs into a text box. So then the question was how to get such lists with a reasonable amount of effort. Powershell to the rescue. Here's the approach:

> $tdse = new-object -com TDS.TDSE
> $alg = $tdse.GetObject("tcm:12-1255-8",1)
> $f = $tdse.CreateListRowFilter()
> $f.SetCondition("ItemType", 16)
> $docAlgItems = [xml]$alg.Info.GetListUsingItems(1,$f)
> $AlgTcms = $docAlgItems.ListUsingItems.Item | %{$_.ID}
> $algTcms[0..999] -join "," | out-file first1000.txt
> notepad .\first1000.txt

As you can see, we start with instantiating TDSE and getting hold of the schema. To protect the innocent, let's pretend that this schema is called Algernon. $alg is the variable representing the schema.

So - after a quick where-used, we do the standard Powershell trick of casting the results to an XmlDocument and reading off the ID attributes. By this time, $AlgTcms contains an array of TCM URIs, and it's almost trivial to dump the first thousand into a file by specifying an array range (obviously you can get the second thousand by saying $AlgTcms[1000..1999], and so on) and doing a -join.
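
If you wanted to go the whole hog, a small loop would write all the batches out in one go. (A quick sketch - the batch size and file naming are just what I might choose.)

$batchSize = 1000
for ($i = 0; $i -lt $AlgTcms.Count; $i += $batchSize) {
    $end = [Math]::Min($i + $batchSize, $AlgTcms.Count) - 1
    $AlgTcms[$i..$end] -join "," | out-file ("batch{0}.txt" -f (($i / $batchSize) + 1))
}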

So instead of maxing out the system for several hours, we were able to schedule the work in reasonable batches through the day.

Smoke-testing my Tridion database connections

Posted by Dominic Cronin at Jan 27, 2012 11:15 PM |

I'm installing and configuring Tridion 2011 SP1. There are now so many databases that it's just insane to try to keep track of them all by hand, but nil desperandum, the Powershell is here. OK - you might not be quite so compulsive/obsessive, but I threw together a script that lets me have a list of verified working logins before I start poking at config files. At the very least, it brings out some findings about consistency across the different products. Here's what I did:

function CheckDatabase($connStringBuilder, $queryString="select DB_VERSION from TDS_DB_INFO", $CommandType="Text"){
  # Open a connection using the supplied connection string builder
  $conn = new-object System.Data.SqlClient.SqlConnection
  $conn.ConnectionString = $connStringBuilder.ConnectionString

  $conn.Open()

  # Run the version query (or stored procedure) for this product's database
  $comm = new-object System.Data.SqlClient.SqlCommand
  $comm.CommandText = $queryString
  $comm.CommandType = $CommandType
  $comm.Connection = $conn
  $reader = $comm.ExecuteReader()
  $readResult = $reader.Read()
  $dbversion = $reader.GetString(0)
  $reader.Close()
  $conn.Close()

  # Report success or failure per database
  $dbName = $connStringBuilder["Initial Catalog"]
  if ($dbversion.length -gt 0) {"$dbname Database version found: $dbversion"} else {"$dbname fffft"}
}


$connStringBuilder = new-object System.Data.SqlClient.SqlConnectionStringBuilder
$connStringBuilder["Data Source"] = "MY_LAPTOP\DEVELOPER"
$connStringBuilder["Initial Catalog"] = "Tridion_cm"
$connStringBuilder["User ID"] = "TCMDBUSER"
$connStringBuilder["Password"] = "Yes I used the same password for all dbs - don't you?"

CheckDatabase $connStringBuilder


$connStringBuilder["Initial Catalog"] = "Tridion_Broker"
$connStringBuilder["User ID"] = "TridionBrokerUser"

CheckDatabase $connStringBuilder

$connStringBuilder["Initial Catalog"] = "Tridion_cm_email"
$connStringBuilder["User ID"] = "TMSDBUSER"

CheckDatabase $connStringBuilder "select DB_VERSION from OE_DB_INFO"

$connStringBuilder["Initial Catalog"] = "Tridion_submgmt"
$connStringBuilder["User ID"] = "TMSSMUSER"

CheckDatabase $connStringBuilder "select DB_VERSION from DB_INFO"

$connStringBuilder["Initial Catalog"] = "Tridion_tracking"
$connStringBuilder["User ID"] = "TMSPSUSER"

CheckDatabase $connStringBuilder "PS_READ_DBINFO" "StoredProcedure"

$connStringBuilder["Initial Catalog"] = "Tridion_TranslationManager" 
$connStringBuilder["User ID"] = "TMUser"

CheckDatabase $connStringBuilder "SELECT DB_VERSION FROM TM_DB_INFO"

$connStringBuilder["Initial Catalog"] = "Tridion_Ugc"
$connStringBuilder["User ID"] = "TridionUgcUser"

CheckDatabase $connStringBuilder "SELECT DB_VERSION FROM UGC_TDS_DB_INFO"

And here's what the output looked like:

> . C:\Users\Administrator\Desktop\dbTest.ps1
Tridion_cm Database version found: 6.1.0.0
Tridion_Broker Database version found: 6.1.0.0
Tridion_cm_email Database version found: 2.2.0.0
Tridion_submgmt Database version found: 2.2.0.0
Tridion_tracking Database version found: 2.2.0.0
Tridion_TranslationManager Database version found: 3.0.0.0
Tridion_Ugc Database version found: 6.1.0.0

All in all - maybe not worth the effort, but somehow satisfying. Is it useful? Maybe.

Using Ghostscript to reduce the size of a PDF

Posted by Dominic Cronin at Dec 18, 2011 01:10 PM |

I had scanned in a document with the intention of emailing it. (For this I usually use PDFCreator, which allows you to aggregate the results of several scans into a single PDF.) On this occasion, I had scanned all four pages of the document before realising that, with my current scanner settings, the resulting document would be about 12MB. So I was faced with the choice of either scanning them all again, or finding a way to reduce the size of the PDF. A quick Google turned up this link, which gave the following command line to use with Ghostscript:

gswin32c -sDEVICE=pdfwrite -dNOPAUSE -dBATCH -dPDFSETTINGS=/ebook -sOutputFile=C:\newFile.pdf C:\originalFile.pdf

The reason I had Googled for a Ghostscript solution was that I already knew I had it installed as part of Cygwin. (I always install Cygwin on any Windows machine I need to use regularly - mostly for the SSH client, but I usually do a full install just so that all those useful utilities are just there.) After a bit of poking, I realised that instead of typing "gswin32c" I just needed "gs". The rest of the command worked just fine, and I ended up with a PDF of somewhat less than 2MB.
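
So the command, as I actually ran it from the Cygwin shell, looked something like this (the file names are just examples):

gs -sDEVICE=pdfwrite -dNOPAUSE -dBATCH -dPDFSETTINGS=/ebook -sOutputFile=smaller.pdf original.pdf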

So here's a hat tip to the Ghostscript contributors over the years. Thanks. Isn't free software great?