Dominic Cronin's weblog

Showing blog entries tagged as: Windows

Getting IIS Express to run in a 64 bit process, and other fun Tridion content delivery configurations

Posted by Dominic Cronin at Jul 24, 2013 07:55 PM

In the last couple of days, I've spent far more time than I'd like figuring out how to get a Tridion-based web application to run correctly under Visual Studio. There are three basic choices:

  1. Run it directly using Visual Studio
  2. Run it using IIS Express
  3. Run it using IIS (non-Express version)

As the application is intended to run on a 64 bit architecture, there are some challenges. Visual Studio runs in 32 bit mode, so the first option is out. Using full-on IIS is an attractive thought; you can manually configure the application pool to run in 64 bit mode. Unfortunately, getting a debug session up and running takes more configuration than that. You have to set up the web site correctly, and it was just too fiddly. I ran out of time, or steam or whatever. (Somebody will probably tell me it's easy, and I dare say it is when you know how, and aren't spending time you really should be spending on something else. Any hints are always welcome.)

Of course, with a Tridion site, half the game is making sure you have the correct DLLs in place for the processor architecture you are using. Along the way, I discovered that the quick and dirty way to tell whether you have a 32 or 64 bit version of xmogrt.dll (Juggernet's "native" layer) is the size. The 64 bit version comes in at 1600KB, and the 32 bit version is about half that, at 800KB or so. This varies from version to version, so on a 2013 system it's 1200-ish/900-ish KB, but once you get the hang of it, you can tell them apart at sight, which is pretty useful. The other DLLs are also important, although as far as I can tell, only Tridion.ContentDelivery.AmbientData.dll is hard-compiled for the 64 bit architecture, at least on the 2011 system I was working on. The rest of the .NET assemblies are compiled to MSIL, which, of course, will run on either architecture.
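For what it's worth, here's the sort of PowerShell one-liner I'd use to eyeball the size (a sketch only - the path is just an example, and as I said, the exact sizes depend on your Tridion version):

# Show the file size rounded to KB - a quick 32/64 bit sanity check
# (point it at your own web application's bin folder)
Get-Item 'C:\inetpub\www.visitorsweb.local\bin\xmogrt.dll' |
    Select-Object Name, @{Name='SizeKB'; Expression={[math]::Round($_.Length / 1KB)}}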

But I digress. The thing I wanted to blog (and this will definitely be tagged note-to-self) was how to get IIS Express to run in 64 bit mode. By default it runs on 32 bits, but if you follow this link:

http://visualstudio.uservoice.com/forums/121579-visual-studio/suggestions/3254745-allow-for-iis-express-64-bit-to-run-from-visual-st

... you will find the following nugget of goodness:

You can configure Visual Studio 2012 to use IIS Express 64-bit by setting the following registry key:

reg add HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\11.0\WebProjects /v Use64BitIISExpress /t REG_DWORD /d 1

However, this feature is not supported and has not been fully tested by Microsoft. Improved support for IIS Express 64-bit is under consideration for the next release of Visual Studio.

Very handy indeed. Running under IIS Express is then just one click of the button. Just works.
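If you'd rather do the registry edit from PowerShell than from reg.exe, something along these lines should be equivalent (a sketch - same key and value as in the quote above; note that the WebProjects key may not exist until you create it):

$key = 'HKCU:\Software\Microsoft\VisualStudio\11.0\WebProjects'
if (-not (Test-Path $key)) { New-Item $key -Force | Out-Null }  # create the key if it isn't there yet
New-ItemProperty -Path $key -Name Use64BitIISExpress -PropertyType DWord -Value 1 -Force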

And by way of a PS (Post Script, that is, not PowerShell), here's how you find the processor architecture of a DLL. (This time on my 2013 image.)

PS C:\inetpub\www.visitorsweb.local\bin> [reflection.assemblyname]::GetAssemblyName((resolve-path '.\Tridion.ContentDelivery.AmbientData.dll')).ProcessorArchitecture
MSIL
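If you want to sweep a whole bin folder rather than check one file at a time, the same trick loops nicely. One caveat: GetAssemblyName() throws on native DLLs like xmogrt.dll, so this sketch just labels those as such:

Get-ChildItem *.dll | ForEach-Object {
    try {
        $arch = [Reflection.AssemblyName]::GetAssemblyName($_.FullName).ProcessorArchitecture
    }
    catch {
        $arch = 'native (not a .NET assembly)'  # e.g. xmogrt.dll
    }
    "{0,-10} {1}" -f $arch, $_.Name
}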

Well anyway - it's no fun scratching your head over stuff like this. Maybe this helps.

Debugging 64 bit Tridion content delivery on IIS 7.5

I'm currently developing a web application which will run on Windows 2008 R2 and which is intended to run in a 64 bit application pool. This means that I'm running IIS 7.5, and that the web application is installed with the 64 bit versions of the Tridion content delivery assemblies. As you'll know if you've tried to run this kind of web application in a 32 bit process, you pretty soon get exceptions telling you that you have an invalid format. This gets a little inconvenient if you just start to debug your web application in Visual Studio. By default, if you have a page selected and hit the big green Run triangle, the page will launch in IIS Express. If you have IIS 7.5, then IIS Express runs as a 32 bit process, so the default setup just isn't going to work for you.

So - what to do? I had two options:

  1. Configure the properties of the web application to debug using IIS rather than IIS Express
  2. Launch the web page directly from the browser, and attach the debugger to the correct w3wp.exe process.


To be honest, the second of these was the choice that most matched my usual debugging approaches. Having said that, I did try the first approach, but so far without success. Visual Studio 2012 has frozen on me a few times while trying this. I'm interested if anyone has any tips on getting this working, but right now, I'm happy enough that I was able to succeed in attaching a debugger to w3wp.exe.

My biggest challenge was to figure out which process I wanted to attach to. On my development server, I have quite a few web sites running, and it's not altogether obvious which w3wp.exe to attach to. Attaching to them all might work in a trivial case, but realistically, it takes quite a while to load all the DLLs, and attaching to any more processes than necessary is just going to hurt too much. So - how do you find out which process it is?

The first step is to ensure you have the IIS PowerShell provider installed on your server. These days it ships as a module, so if it's available on your system, you should be able to open a PowerShell prompt and type:

Get-Module -ListAvailable

If the response includes "WebAdministration" you are good to go. Just import the module as follows:

Import-Module WebAdministration

If this succeeds, you should be able to "change directory" into the IIS provider. (Although a PowerShell purist might prefer set-location... whatever floats your boat!)

cd IIS:

If you can't find the module, go into the Server Manager and check that you have the relevant role services for IIS installed. On other platforms, you might find that you can install it with the Web Platform Installer from the MSDN web site.
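On Windows Server 2008 R2, for example, you can also add the relevant role service from PowerShell itself (a sketch, using the feature name from the ServerManager module):

Import-Module ServerManager
# 'IIS Management Scripts and Tools' - this brings the WebAdministration module along with it
Add-WindowsFeature Web-Scripting-Tools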

Now you're ready to find the process that you want to attach to. Assuming that your application pool is called "MyApplicationPool", you can list its worker processes like this (or use "dir" or "ls", either of which is an alias for "gci"):

> gci IIS:\AppPools\MyApplicationPool\WorkerProcesses
Your output should look something like this:

Process Id  State    Handles  Start Time
----------  -----    -------  ----------
2608        Running  776      1/2/2013 6:55:33 PM

This assumes, of course, that your app pool is actually running, but you'd have made sure it was before trying to debug it, right? Anyway - as you can see, the process id is there just to read off, and you can get straight on with your debugging session.
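And if all you want is the bare process id to paste into Visual Studio's "Attach to Process" dialog, you can read it straight off the worker process object (a sketch, assuming the same app pool name as above; if the pool has several worker processes, you'll get several ids back):

(Get-ChildItem IIS:\AppPools\MyApplicationPool\WorkerProcesses).processId  # 2608 in the example above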

Why is it really slow to access Tridion via webdav?

Posted by Dominic Cronin at Jun 01, 2012 07:02 PM

Today I wanted to upload 20 or so image files to my Tridion server. This is a bit of a faff to do through the normal user interface. (You'd have to create multimedia components one by one and then upload the binaries individually.) But no problem, because you can always use WebDAV, right? I wanted to upload the images from the server, which runs Windows 2008 Server R2. OK - so where are we now? Erm... Computer.... right-click ... Map network drive.... Pick a letter.... http://localhost/webdav/ ....OK! Boom... there we are - a nicely mapped webdav drive.
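For the record, the command line gets you to the same place (pick your own drive letter, of course):

net use W: http://localhost/webdav/  # relies on the WebClient service being started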

But.... it was awful. Like wading knee-deep through treacle with all the acrobats of the Chinese state circus balanced on your head. Slow? I could have made a cup of tea while it opened a folder.

So what was going on? My first instinct was that it probably wasn't Tridion to blame. Something like this, which more or less renders the feature unusable, would have been flushed out during product testing and fixed. So let's start by blaming Windows! (Millions of Apple fan-persons and Linux-inhaling Bill-haters can't all be wrong, eh?) Oh, enough of that. Suffice it to say that a quick google took me to Mark Lognoul's blog, where he describes the solution to this problem on Vista or Seven. Does it work on Server 2008? Yup - works like a charm. Thanks, Mark. Job's a good'un.

Extending the boot partition of Windows 2003 Server

Posted by Dominic Cronin at Apr 04, 2010 06:25 PM

In my work, I quite commonly have servers on which, over time, I end up wanting to install more software, or simply to upgrade what's there already - in any case, consuming more disk space than was originally envisaged by the person who set up the server. Today I was upgrading a Tridion development image to Tridion 2009 SP1. This was on a VMWare image whose C: drive was maxed out at 16GB. A while ago, I'd filled up the 16GB, and then gone through the rigmarole of trying to make it larger, and failed. At the time, it was more expedient to just add another disk and move some of the data off the C: drive. That approach only gets you so far, though, and sooner or later you need a bigger C: drive.

So I'd already got as far as using the VMWare utilities to increase the size of the "physical" disk to 20GB, and then I'd booted the image from a GParted "live CD" .iso and increased the size of the partition, also to 20GB. The problem was that although the disk management snap-in could see the full 20GB, as far as Windows Explorer was concerned, there were only 16GB (and a pretty full 16GB at that!)

If you've ever been through this, you'll know that it can be a matter of getting the magical incantations just right. The operating system's own tools won't let you expand the boot partition, or a system partition, or the partition where the page file lives - a whole bunch of strange restrictions. (Yes - strange - even in 2003!) The reason for using GParted in the first place was that you definitely can't do it while Windows is running off the offending partition, and at some time in the past I'd followed this approach and it had just worked. Why not now? I don't know.

To cut a long story short, it just sat there staring at me and wouldn't do what I wanted, when, for no really good reason, I booted the system from GParted again, and this time used the "check and repair" utility. It duly checked and repaired, and I rebooted the system normally again, only to see that Windows had now decided that the disk was suspect and wanted to run CHKDSK. I let it run, and hey presto, when the system came up, Windows Explorer could see the full 20GB. Job's a good'un, eh?

So, while I don't have any solid explanation for it all, I'm adding a blog entry in the hope that it helps someone - perhaps myself on some future occasion. I don't know whether it was actually something the GParted check/repair did that fixed the problem; in terms of probabilities, I'm leaning more in the direction that it was CHKDSK that did the actual fixing. If anyone is in the same boat, I'd be interested to know whether just running CHKDSK is sufficient.
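If you want to test the CHKDSK theory without going anywhere near GParted, you can schedule the check yourself. On the system volume, CHKDSK can't run online, so it offers to run at the next reboot:

chkdsk C: /f  # answer Y when asked to schedule the check for the next restart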