
Dominic Cronin's weblog

Mysterious 404 errors showing up in the Tridion message centre

Posted by Dominic Cronin at Dec 20, 2012 12:37 AM |

Today I spent some time setting up a Tridion 2011 Content Manager server. In fact, the content manager had already been installed and had been working fine. Then we'd installed Microsoft Search Server. OK - so it's quite unusual to be doing quite so much all on one server, but this is a customer with minimal needs. Not everyone has 200 servers in the rack! Although Search Server is packaged as a product in its own right, it's built on Sharepoint, and when you install it, it seems to bring two thirds of Sharepoint with it, including two MSSQL instances and three web sites. So to get the benefit of Microsoft's "free" search services, we'll probably have to configure another couple of gigs of RAM. (SFX: sound of a cash register going "ca-ching" at VMWare headquarters.)

Anyway, to be fair, the search solution looks pretty good and it definitely does what it says on the box, although it's got about a hundred configuration screens (I haven't actually counted them, though). Well anyway - we'd installed this beast on our previously working Tridion server, and most things were going OK. Until I did an IISRESET, and then suddenly the Tridion CME started to complain about a 404 problem. So when you started the CME, you'd get error messages like:

The remote server returned an error (404) not found

On examining the message centre, I found this message six times, along with "Loading list of languages failed" and "Loading list of locales failed". Sure enough, the relevant drop-downs in the User preferences were not populated.

When I F12'd the browser (is there a verb, to F12? There should be), I could see that the browser wasn't seeing any responses with HTTP status 404. So what was going on?

After digging a bit on the server, I found that there were entries in the web server log like this:

2012-12-19 12:59:41 ::1 POST /WebUI/Models/CME/Services/General.svc/GetListCustomPages - 80 BLAH\Administrator ::1 - 404 0 0 58
2012-12-19 12:59:41 ::1 POST /WebUI/Models/CME/Services/General.svc/GetListFavorites - 80 BLAH\Administrator ::1 - 404 0 0 62
2012-12-19 12:59:41 ::1 POST /WebUI/Models/CME/Services/General.svc/GetListSystemAdministration - 80 BLAH\Administrator ::1 - 404 0 0 15
2012-12-19 12:59:41 ::1 POST /WebUI/Models/TCM54/Services/Lists.svc/GetList - 80 BLAH\Administrator ::1 - 404 0 0 30
2012-12-19 12:59:41 ::1 POST /WebUI/Models/TCM54/Services/Lists.svc/GetListEnumerationValues - 80 BLAH\Administrator ::1 - 404 0 0 5
2012-12-19 12:59:41 ::1 POST /WebUI/Models/TCM54/Services/Lists.svc/GetListEnumerationValues - 80 BLAH\Administrator ::1 - 404 0 0 8

So I could see from here that the errors were occurring when the CME web application made a local call-back on the server to its own service layer. A bit more poking around showed that the problem showed up whenever the CME made a callback to a service.

So what was going on? (Did I ask that already?)

It turned out that installing large portions of Sharepoint had had the undesired effect that the Tridion CME web site no longer owned the default binding. We had a host header binding mapped in IIS, and you could reach this just fine, but since the install, traffic aimed at 'localhost' was going to the wrong web site. Actually, Tridion has got this covered, because in the WebRoot Web.Config there's an app setting called "Tridion.WCF.RedirectTo". This was pointing to localhost (which had worked fine when the server was first installed). So when the CME tried to make calls back to the Model services, it was aiming these calls at localhost, which, of course, ended up in the Sharepoint site and a 404.

We fixed the immediate problem by editing the IIS bindings, but we're considering whether it might be good practice to always configure Tridion.WCF.RedirectTo to go to the name of your site, and not to localhost.
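If we do take that approach, it's the kind of thing that's easily scripted. Here's a sketch of what I mean in PowerShell - the path is the default install location, the host name is obviously just an example, and I'm assuming the setting lives in the usual <appSettings> section of the WebRoot Web.Config:

$filename = 'C:\Program Files (x86)\Tridion\web\WebUI\WebRoot\Web.config'
$conf = [xml](gc $filename)
$conf.configuration.appSettings.add |?{$_.key -eq 'Tridion.WCF.RedirectTo'} |%{$_.value = 'http://tridion.example.com'}
$conf.Save($filename)
iisreset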

The relevant Tridion documentation is here.

Tridion Explorer reports System.ServiceModel.ServiceActivationException

Posted by Dominic Cronin at Dec 15, 2012 09:50 PM |

I'd been noticing strange messages popping up in the message centre of the SDL Tridion Explorer. The messages were about some service call failing with a 500 status and System.ServiceModel.ServiceActivationException, and seemed to be coming from various service points under C:\Program Files (x86)\Tridion\web\WebUI\Models\TCM54\Services. Here's an example:

/WebUI/Core/Services/Communicator.svc/Invoke failed to execute. STATUS(500): System.ServiceModel.ServiceActivationException

Not all the time, just occasionally when I did certain things. The thing that got me irritated enough to do something about it was when I wanted to delete a list of old versions of some items, and the multiple items functionality was breaking, and throwing up these messages. I could delete them one item at a time, but not all together. I suspect you can get problems with other things too, looking at the list of services that are served the same way from Models\TCM54\Services, and I think I remember also having problems with publishing and where-used.

A bit of Googling pointed me in the right direction, and I ended up after a couple of false starts editing: C:\Program Files (x86)\Tridion\web\WebUI\WebRoot\Web.Config

What you need to do to fix the problem is to add some configuration to get WCF to behave properly. On my now-working system, it looks like this, but YMMV.
<serviceHostingEnvironment>
  <baseAddressPrefixFilters>
    <add prefix="http://localhost/"/>
  </baseAddressPrefixFilters>
</serviceHostingEnvironment>
Actually - once you're poking around in the web.config file, it's pretty easy, because it turns out that Tridion have already included the relevant configuration, commented out.
It may be that it's also called out in the installation documentation, and that I've missed it. Anyway - joining up the dots between the symptoms and this particular piece of config isn't so obvious, and it's always possible to set up your system correctly and then break it later by adding a new name binding in IIS. So: this "note to self" post, which will maybe help me to remember the extra step that's needed. And it can't hurt to have the cause and solution in close proximity in a Googleable location. :-)
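By the way, that <serviceHostingEnvironment> block lives inside the standard WCF <system.serviceModel> section of the same Web.Config - nothing Tridion-specific about its placement. If you just want to check which prefix filters are currently active, a quick bit of PowerShell will tell you (a sketch, assuming the default install path):

$conf = [xml](gc 'C:\Program Files (x86)\Tridion\web\WebUI\WebRoot\Web.config')
$conf.configuration.'system.serviceModel'.serviceHostingEnvironment.baseAddressPrefixFilters.add | %{$_.prefix}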

Using helpers in Tridion Razor templating

Today, for the first time, I used a helper in a Razor Tridion template. I'd made a fairly standard 'generic link' embedded schema, so that I could combine the possibility of a component link and an external link in a link list, and allow for custom link text. (Nothing to see here, move along now please.)  However, when I came to template the output, I wanted to have a function that would process an individual link. A feature of Razor templating is that you can define a @helper, which is a bit like a function, except that instead of a return value, the body is an exemplar of the required output. There is also support for functions, so to lift Alex Klock's own example:

@functions {
    public string HelloWorld(string name) {
        return "Hello " + name;
    }
}

and

@helper HelloWorld(string name) {
    <div>Hello <em>@name</em>!</div>
}

will serve fairly similar purposes.

What I wanted to do today, however, was slightly different; I didn't want to pass in a string, but a reference to my embedded field. All the examples on the web so far are about strings, and getting the types right proved interesting. I started out with some code like this:

@foreach(var link in @Fields.links){
  @RenderLink(link);
}

So I needed a helper called RenderLink (OK - this might be a very trivial use-case, but a real problem all the same). But what was the type of the argument? In theory, "links" is an EmbeddedSchemaField (or to give it its full Sunday name: Tridion.ContentManager.ContentManagement.Fields.EmbeddedSchemaField), but what you get in practice is an object of type "Tridion.Extensions.Mediators.Razor.Models.DynamicItemFields". I'd already guessed this by poking around in the Razor Mediator sources, but after a few of my first experiments went astray, I ended up confirming it with @link.GetType().FullName.

Well I tried writing a helper like this:

@using Tridion.Extensions.Mediators.Razor.Models 
@helper RenderLink(DynamicItemFields link){
... implementation
}

but that didn't work, because when you try to call the methods on 'link' they don't exist.

And then, just for fun, of course, I tried

@using Tridion.ContentManager.ContentManagement.Fields 
@helper RenderLink(EmbeddedSchemaField link){
... implementation
}

but that was just going off in an even worse direction. Yeah, sure, that type would have had the methods, but what I actually had hold of was a DynamicItemFields. Eventually, I remembered some hints in the mediator's documentation and tried using the 'dynamic' keyword. This, it turns out, is what you need. The 'dynamic' type lets you invoke methods at run-time without the compiler needing to know about them. (At last, I was starting to understand some of the details of the mediator's implementation!)

@helper RenderLink(dynamic link){
... implementation
}
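Just to round things off, here's roughly the shape the implementation takes. Treat this strictly as a sketch: 'url' and 'linkText' are made-up field names, so they won't match your schema - the point is simply that once the parameter is dynamic, the field access resolves happily at run-time.

@helper RenderLink(dynamic link) {
    @* Sketch only: 'url' and 'linkText' are hypothetical field names *@
    if (link.url != null) {
        <a href="@link.url">@link.linkText</a>
    }
}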

This may be obvious with hindsight (as the old engineers' joke has it ... for some value of 'obvious'). For now, I'm writing another blog post tagged #babysteps and #notetoself, and enjoying my tendency to take the road less travelled.

Two roads diverged in a yellow wood,
And sorry I could not travel both
And be one traveler, long I stood
And looked down one as far as I could
To where it bent in the undergrowth;

Then took the other, as just as fair,
And having perhaps the better claim,
Because it was grassy and wanted wear;
Though as for that the passing there
Had worn them really about the same,

And both that morning equally lay
In leaves no step had trodden black.
Oh, I kept the first for another day!
Yet knowing how way leads on to way,
I doubted if I should ever come back.

I shall be telling this with a sigh
Somewhere ages and ages hence:
Two roads diverged in a wood, and I—
I took the one less traveled by,
And that has made all the difference.

-- Robert Frost

Enabling XML syntax-highlighting for .config files in gVim

Posted by Dominic Cronin at Nov 23, 2012 11:15 PM |

I've used the vi text editor for many years (at least long enough to know that it's pronounced vie and not vee-eye!). Over those years my level of expertise has varied somewhat - I'm fairly sure I've learned some commands and forgotten them several times over. Anyway - recently (i.e. in the last year or so), I've put some more effort into reacquainting myself with some of its many joys. In practice, of course, I really mean vim: I'd be hard-pressed to remember the last time I saw vi in its "good-old-fashioned" form (does one say Plain-old-vi?). As most of my work is on Windows systems, this means using gVim.

Of the many improvements that vim has over vi, syntax highlighting is one of my favourites. The trouble is, one of my commonest use-cases for editing text files on Windows systems is .NET configuration files. Because these have a file extension of .config, they aren't recognised by default as XML files, and I end up going through the rigmarole of selecting one menu option to get a choice of file types added to the menus, and then locating XML among those newly added options to get highlighting to come on. Well there had to be a better way, and of course there was. What you have to do is this:

  • Locate your Vim directory (on the system I was working on this evening, it's "C:\Program Files (x86)\Vim\").
  • Having found this directory, locate or create C:\Program Files (x86)\Vim\vimfiles\ftdetect
  • In ftdetect, create a file called config.vim with the following contents:
au BufRead,BufNewFile *.config     set filetype=xml

I have Windows configured to use vi as the default editor for .config files, so now with this in place, all I have to do is double-click on the file and it opens with XML syntax-highlighting enabled. Great stuff!

Templating unbalanced tags in the Razor Mediator

Posted by Dominic Cronin at Nov 17, 2012 06:30 PM |

I've recently started using the Razor Mediator for Tridion (http://code.google.com/p/razor-mediator-4-tridion/) on a project, and it's been an interesting experience. To be honest, I wondered at first whether it would shift my views further in the direction of putting code in the templating layer, but I suspect I'll probably remain a die-hard token replacer. I did start at first with writing rather more C# in my templates than I generally would, but the reality is that the complexity always increases, and pretty soon you find yourself wanting to debug the code in Visual Studio. Then I'd rather have it in an assembly of its own. (OK - maybe there are, or could be, techniques for debugging your code in-place in the Razor template, but I'm not sure if the game would be worth the candle.)

Having said that, a few simple loops and if-blocks should be perfectly OK in the templating layer, which brings me to the subject of this post. My design has a page template which manages a list, in which the <li/> elements are created by a component template. The responsibility for the <ul/> belongs in the page template. (Yes - I know, but I've thought about it, and for what I'm trying to do, this is what makes the most sense.) So what about the scenario where they don't place any of the relevant component presentations on the page? Then I don't want the <ul> or the </ul> either. So I looked at the examples, and found how to do an if-block. How hard can it be, right? But this was where I hit another of my #babysteps learning points, which I'd like to share.

If you want to have an entire feature of your page appear or disappear based on a condition, you can simply write something like:

@if (someCondition) {
  <h1>The condition was met. Yeehah!</h1>
}

Straightforward enough: you can just put your desired html output in your block, and it appears or doesn't depending on the condition. And at this point I was in full-on how-hard-can-it-be hubris-mode, cruising for a bruising and headed for a fall. Ok - let's go:

@{
  var documents = @GetComponentPresentationsByTemplate("My Documents CT");
}
@if (documents.Count > 0) {
 <ul class="lookListy"> 
}
@foreach (var cp in documents) {
  @cp.RenderComponentPresentation()
}
@if (documents.Count > 0) {
 </ul> 
}

... or something similar. Looks reasonable, eh? (OK - maybe with a bit of practice I can get that tidier.) Except it's not. It doesn't compile, or more specifically, the C# generated by Razor doesn't compile, and in Tridion, all you see is a nasty message about the wrong number of curly brackets or semicolons or some such. It doesn't really matter much what the error is, because the structure of your code is broken, and the thing it's reporting is further down, somewhere in the generated code anyway.

Nota Bene: This level of error reporting is reason enough to avoid doing any complex logic in your template. Put it in a class, for goodness' sake!

So what was the problem? It turns out that to put HTML in-line in a Razor block, the tags need to balance, so you can say

<ul>....</ul>

but not an opening

<ul>

without the closing tag.

This is not an issue with razor-mediator-4-tridion per se, but rather one with the way Razor itself works. Still - to do a successful Razor templating implementation in Tridion, you'll almost certainly need to know it. The solution is simple: you just need to wrap your unbalanced tags in a <text/> wrapper, as follows:

@{
  var documents = @GetComponentPresentationsByTemplate("My Documents CT");
}
@if (documents.Count > 0) {
 <text><ul class="lookListy"></text> 
}
@foreach (var cp in documents) {
  @cp.RenderComponentPresentation()
}
@if (documents.Count > 0) {
 <text></ul></text> 
}

This will now compile correctly, and produce the desired result.

Thanks to the contributors over at http://code.google.com/p/razor-mediator-4-tridion/. It's a great project, and I can see lots of potential for using it in my own work. Much as I'm a fan of XSLT for other uses, in templating its verbosity tends to make people push important code out of view, and well... Dreamweaver syntax ain't pretty either. :-)

EDIT: Thanks to a suggestion by Neil Gibbons (thanks Neil!), I now realise that if you nest the foreach inside the if (which works for the logic I was trying to achieve), the <ul/> is seen as 'balanced' and doesn't need the <text> wrapper. So the problem is less severe than I had thought, but it's still one you need to be aware of.

@{
   var documents = GetComponentPresentationsByTemplate("My Documents CT");
   if (documents.Count > 0) {
       <ul class="lookListy">
       @foreach (var cp in documents) {
           @cp.RenderComponentPresentation();
       }
       </ul>
   }
}

Toggling the javascript minification of the Tridion GUI from the powershell

Posted by Dominic Cronin at Nov 13, 2012 12:50 AM |

Most of the time, I use a single Tridion development image for multiple purposes, including whatever time I get to spend researching how to do GUI extensions. When I'm flipping back out of research mode into day-to-day development such as templating, I want the benefit of the javascript minification that I'd rather switch off while poking around in the guts of Anguilla. So just to make this switch as painless as possible, I've added the following code to my PowerShell $profile.

function SetGuiMinification($value){
  $filename = 'C:\Program Files (x86)\Tridion\web\WebUI\WebRoot\Configuration\System.config'
  $conf = [xml](gc $filename)
  $conf.Configuration.filters.filter |?{$_.type -like '*JScriptMinifier*'} |%{$_.enabled = $value}
  $conf.Save($filename)
  iisreset
}

function guimin {SetGuiMinification "always"}
function guinomin {SetGuiMinification "never"}

Now I can toggle backwards and forwards simply by typing guimin or guinomin (you may favour different words or spellings!).

Of course, this technique ought to work just as well to manipulate other elements and attributes in the XML files that control a Tridion installation. Perhaps you'd modify it to toggle the CSS minification too (removing the -like clause should do it).
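For what it's worth, the variant without the -like clause would look like this - the same function, just toggling every filter it finds, on the assumption that the CSS minifier is configured as another filter in the same System.config:

function SetAllMinification($value){
  $filename = 'C:\Program Files (x86)\Tridion\web\WebUI\WebRoot\Configuration\System.config'
  $conf = [xml](gc $filename)
  $conf.Configuration.filters.filter |%{$_.enabled = $value}
  $conf.Save($filename)
  iisreset
}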

If you have any good ideas for using this technique, please let me know.

Context Bag - a Tridion templating pattern

Posted by Dominic Cronin at Oct 30, 2012 01:25 AM |

When Tridion introduced compound templating (or modular templating if you prefer) in R5.3, one of the things that gradually became apparent was that in the new approach, the relationship between page renders and component renders was rather different. In VbScript, we'd been used to having a fairly simple way to pass parameters between the two. You could read and write parameters from and to a kind of global scope. This meant you could have your component templates influence the way the page templates worked, or have one component template influence the outcome of other component templates that were invoked further down the page. In modular templating, all this was over. You had a Context Variables dictionary available to you in both kinds of render, but the Context Variables dictionary you got in the component context was a new dictionary populated with the values from the original Context Variables of the page.

Of course, most of the time, this model works great. If you have a need to go beyond its limits, the first thing you should do is have a good look at your design and evaluate whether what you're trying to do is really smart. But still - there are rare cases where it can be really useful to pass state back up from the component to the page, make it available to other component renders, etc. Well the good news is, it is possible - you just have to add one more layer of redirection. I've been telling people for ages that I thought this would be possible; I'd just never got round to proving it in code. Well now I have, and I've written up how to implement this pattern over at Tridion Practice. I hope most of you will never need to do it, because it adds another level of complexity, and mostly there's a better way. Anyway - either I hope some small number of you will find it useful, or perhaps I'm just trying to establish prior art in case Apple decide to patent it.

How to list all the component templates associated with a schema

Posted by Dominic Cronin at Sep 26, 2012 08:45 PM |

This posting might seem a little trivial, but having figured it out, I'm blogging it for my own reference. In fact, I was almost going to put it on the Tridion cookbook, but this is legacy stuff. There won't ever need to be a core service version of this, because in 2011, you can get the answer directly from Where Used.

But on older systems, say you wanted to update a schema, and wanted to figure out the impact on your templates. Which templates would you have to check for necessary updates, etc? (Imagine you were going to make a mandatory field optional, and wanted to check whether your templates would break if the user chose not to give a value.)

So you know which schema it is, and you want to know the component templates that have this as a related schema. I started to hack this out in Powershell using what are now for me fairly standard techniques. The trouble is that VBA collections are difficult to iterate over in the Powershell. Fortunately you can use the contains method on the RelatedSchemas collection to get the "where" clause you need. In most systems, you keep your templates, schemas, etc. in a "system" folder, so the script simply starts at that location and recursively grabs all the component templates it can find. If the schema of interest is in the related schemas, it will be listed.

$tdse = new-object -com TDS.TDSE
$interestingSchema = $tdse.Getobject("tcm:10-1234-8",1)
$systemFolder = $tdse.GetObject("tcm:11-123-2",1)
$rf = $tdse.CreateListRowFilter()
$rf.SetCondition("Recursive", $true)
$rf.SetCondition("ItemType", 32)
([xml]$systemFolder.GetListItems(3, $rf)).ListItems.Item | ?{$tdse.Getobject($_.ID,1).RelatedSchemas.Contains($interestingSchema)}

A poor man's Component synchroniser - or using the .NET framework to run XSLT from the PowerShell

Posted by Dominic Cronin at Aug 12, 2012 09:10 PM |

Just lately, I've been doing some work on porting the old Component Synchroniser power tool to the current version of Tridion. If you are familiar with the original implementation, you might know that it is based on a pretty advanced XSLT transformation (thankfully, that's not the part that needs porting), that pulls in data about the fields specified by the schema (including recursive evaluation of embedded schemas), and ensures that the component data is valid in terms of the schema. Quite often on an upgrade or migration project, any schema changes can be dealt with well enough by this approach, but sometimes you need to write a custom transformation to get your old component data to match the changes you've made in your schema. For example, the generic component synchroniser will remove any data that no longer has a field, but if you add a new field that needs to be populated on the basis of one of the old fields, you'll be reaching for your favourite XSLT editor and knocking up a migration transform.

This might sound like a lot of work, but very often, it isn't that painful. In any case, the XSLT itself is the hard part. The rest is just about having some boilerplate code to execute the transform. In the past, I've used various approaches, including quick-and-dirty console apps written in C#. As you probably know, in recent times I've been a big fan of using the Windows Powershell to work with Tridion servers, and when I had to fix up some component migrations last week, of course, I looked to see whether it could be done with the PowerShell. A quick Google led me (as often happens!) to Scott Hanselman's site, where he describes a technique using NXSLT2. Sadly, NXSLT2 now seems to be defunct, and anyway it struck me as perhaps inelegant, or at least less PowerShell-ish, to have to install another executable when I already have the .NET framework, with System.Xml.Xsl.XslCompiledTransform, available to me.

I've looked at doing XSLT transforms this way before, but there are so many overloads (of everything) that sometimes you end up being seduced by memory streams and 19 flavours of readers and writers. This time, I remembered System.IO.StringWriter, and the resulting execution of the transform took about four lines of code. The rest of what you see below is Tridion code that executes the transform against all the components based on a given schema. Sharp-eyed observers will note that in spite of a recent post here to the effect that I'm trying to wean myself from the TOM to the core service, this is TOM code. Yup - I was working on a Tridion 2009 server, so that was my only option. The good news is that the same PowerShell/XSLT technique will work just as well with the core service.

$tdse = new-object -com TDS.TDSE

$xslt = new-object System.Xml.XmlDocument
$xslt.Load("c:\Somewhere\TransformFooComponent.xslt")
$transform = new-object System.Xml.Xsl.XslCompiledTransform
$transform.Load($xslt)
$sb = new-object System.Text.StringBuilder
$writer = new-object System.IO.StringWriter $sb
filter FixFooComponent(){
  $sb.Length = 0
  $component = $tdse.GetObject($_, 2)
  $xml = [xml]$component.GetXml(1919)
  $transform.Transform($xml, $null, $writer)
  $component.UpdateXml($sb.ToString())
  $component.Save($true)
}
$schema = $tdse.GetObject("/webdav/SomePub/Building%20Blocks/System/Schemas/Foo.xsd",1)
([xml]$schema.Info.GetListUsingItems()).ListUsingItems.Item | ? {$_.Type -eq 16}| %{$_.ID} | FixFooComponent

Why should your Tridion GUI extension 'model' have its own service layer on top of the core service?

Posted by Dominic Cronin at Aug 08, 2012 08:49 PM |

I've spent some time lately looking at the architecture for the next phase of implementing the Component Synchroniser for the Tridion Power Tools project. This meant looking through most of the other power tools, because, of course, they are a great resource for anyone wanting to build a Tridion GUI extension. The down side of this is that sometimes, reading the code, you can observe a pattern being used, but it can be hard to tell why it's a good or bad design. I'd noticed that the model of pretty much every power tool is implemented as a WCF service, often acting as a very thin wrapper around the core service client. As I was wondering about this, I posed the following question in the private chat channel used by the Tridion MVPs and community builders:

So if you're doing a gui extension, is it reckoned to be bad form to access the core service directly from your aspx? Or is it just coincidence that most (all?) of the power tools have an additional service layer?

This was enough to spark quite an informative debate, and in keeping with the spirit of the thing, I promised to write it up for general consumption. The contributors were Frank van Puffelen, Nuno Linhares, Peter Kjaer and Jeremy Grand-Scrutton.

The general feeling was that you ought to stick to the pattern I had observed in the power tools. The reasons were as follows:

  • Ease of coding - The Anguilla framework can automatically generate a JavaScript proxy for your service.
  • Maintainability - if you talk directly from JavaScript to the core service, you will not get any compile-time checks, whereas your own service layer would be built in .NET and would therefore have some defences against future (likely) changes in the core service client.
  • Consistency with the rest of the CME - In the CME, views are typically considered fully client-side. Where the CME does use Aspx, this is only to generate some HTML on the server, and typically not for implementation logic.
  • Known issues - ASP.NET postbacks in Anguilla views have been known to cause problems for some people, since e.g. popups won't keep their state through a postback (or an F5 press, for that matter).


According to these criteria, the actual design I was looking at could have used the core directly, as my idea was to generate some HTML. In practice, it turns out that there are other reasons to stick with the extra service layer. Even so, I'm very glad I asked the question, and that the answers I got were so informative. Thanks guys!