Debugging those nasty Windows Azure startup-code problems with IntelliTrace

1. Introducing IntelliTrace

How do engineers figure out what caused a plane crash? One of the things they use is the black-box recording, which recorded all major data from the plane prior to the crash. This recording allows them to step back in time and analyze step by step what happened. Microsoft Visual Studio 2010 Ultimate also has a black box for your code, called IntelliTrace. While your code is running, IntelliTrace writes a log file (called an iTrace file), which you can analyze using Visual Studio Ultimate. Windows Azure also allows you to enable IntelliTrace on the server running your code, which is ideal for figuring out why your code is crashing on the server, especially startup code (because you cannot attach the debugger). In the next part you'll walk through this process. But first you need to download the AzureAndIntelliTrace solution.

2. Deploying the solution

We'll start by deploying the solution, so open the AzureAndIntelliTrace solution with Visual Studio. Right-click the Azure project and choose Publish… The Deploy Windows Azure project dialog should open. Make sure you check the "Enable IntelliTrace for .NET 4 roles" checkbox. Let's have a look at the settings, so click the "Settings…" link and open the Modules tab. You can leave everything at its default settings, but you could remove the StorageClient library if you wanted to use IntelliTrace to track down a storage problem…

Wait for the deployment until it says "Busy…". If something goes wrong during the startup phase of your role instance with IntelliTrace enabled, Azure will keep the role busy so you can download the iTrace file. So that busy state is what you're waiting for. Then it is time to download the iTrace file. You can do that from the Server Explorer window. Open the Windows Azure Compute tree item until you reach the instance (note the Busy state!). The instance name will also mention whether or not it is IntelliTrace enabled.
Now you can right-click the instance to download the iTrace file. Wait for the download to complete; the iTrace file should open automatically in Visual Studio. Scroll down until you reach the Exception Data section. You should see that you got a FileNotFoundException, caused by the runtime not finding the Dependencies assembly.

3. Fixing the dependency problem

This kind of problem is easily solved, but first we need to stop this deployment. Go back to the Windows Azure Activity Log, right-click the deployment and choose "Cancel and remove". The problem is that we have a reference to the Dependencies assembly, but when the project is deployed to Azure the assembly is not copied onto the instance. Go back to the solution and open the DebugThis project. Open the References folder and select the Dependencies assembly. In the Properties window set the "Copy Local" property to true. Try redeploying. Now we will hit another problem, so wait for the instance to go Busy again… Download the iTrace file again. Scroll down to the exceptions section, select the FormatException and click the Start Debugging button. IntelliTrace will put you on the offending line. It is easy to see that the string.Format is missing an argument… You can start the debugger by clicking the "Set Debugger Context Here" button in the gutter. Now you can step back using IntelliTrace… As you can see, IntelliTrace is great for figuring out this kind of problem, especially if your code works in the compute emulator but doesn't on the real Azure server instance…
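The walkthrough doesn't show the offending source line itself, but the failure mode is easy to reproduce. The snippet below is a hypothetical example (the class and message text are made up, not from the AzureAndIntelliTrace solution), showing the kind of call IntelliTrace pinpoints:

```csharp
using System;

class StartupCode
{
    static void Main()
    {
        // The format string has a {1} placeholder but only one argument
        // is supplied, so this throws a FormatException at runtime --
        // exactly the exception IntelliTrace surfaces in the iTrace file.
        string msg = string.Format("Role {0} started at {1}", "DebugThis");
        Console.WriteLine(msg);
    }
}
```

In the compute emulator this kind of bug is caught the moment you run; on a real instance, the iTrace file is what takes the place of that immediate feedback.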

Building a Storage Account helper class (and forget about it)

When you use storage with the managed APIs, you always need a storage account, and you have to make sure you set the whole thing up correctly. The way to do this is slightly different when building a web role versus a worker role, so I decided to tackle this problem by building a simple class that takes care of everything and works the same in a web or worker role. All you need to do is copy/reference this class in each of your projects.

1. The first problem – setting up the ConfigurationSettingPublisher

The storage account uses a connection string. This connection string can come from the normal web.config (when you use storage in a local web site) or from the Azure project's ServiceConfiguration.cscfg. To make the connection string available in both environments, the storage account uses a ConfigurationSettingPublisher, which is a delegate that retrieves the connection string:

Code Snippet

CloudStorageAccount.SetConfigurationSettingPublisher(
  (config, setter) =>
  {
    setter(
      RoleEnvironment.IsAvailable ?
        RoleEnvironment.GetConfigurationSettingValue(config)
        :
        ConfigurationManager.AppSettings[config]
    );
  });

The problem is where to put this code. In a web role you have to put it in startup code, so it goes in Global.asax. In a worker role there is no such thing, so you put it in the worker role's Start method. So, you might say, let's also put this code in the Start method of the web role; but that method runs in another process, so you end up without a properly initialized web role… So here is my solution: create a new class MyStorageAccount that uses a static constructor (also known as the type constructor) to call this code. A static constructor is automatically called by the runtime when you first use the class.
This way you always get correct initialization, and only when you need it:

Code Snippet

using System.Configuration;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;

namespace MessagesLib
{
  public static class MyStorageAccount
  {
    public static string DataConnection = "DataConnection";

    public static CloudStorageAccount Instance
    {
      get
      {
        return CloudStorageAccount.FromConfigurationSetting(DataConnection);
      }
    }

    static MyStorageAccount()
    {
      CloudStorageAccount.SetConfigurationSettingPublisher(
        (config, setter) =>
        {
          setter(
            RoleEnvironment.IsAvailable ?
              RoleEnvironment.GetConfigurationSettingValue(config)
              :
              ConfigurationManager.AppSettings[config]
          );
        });
    }
  }
}

This class uses the static constructor to set up the ConfigurationSettingPublisher, and also has an Instance property. This allows you to retrieve the storage account without always having to provide the connection configuration name. Most applications will only use a single connection, so this should work fine. So instead of writing this all the time:

Code Snippet

var account = CloudStorageAccount.FromConfigurationSetting("DataConnection");
var tableClient = account.CreateCloudTableClient();

You can now write it like this:

Code Snippet

var account = MyStorageAccount.Instance;
var tableClient = account.CreateCloudTableClient();

Or even shorter:

Code Snippet

var tableClient = MyStorageAccount.Instance.CreateCloudTableClient();

Much easier!

2. Handling configuration changes

The next problem arises when the configuration of the role is updated. In that case you need to check whether you can handle the change; if not, you should restart the role.
Typical code for this looks like this:

Code Snippet

RoleEnvironment.Changing += (_, changes) =>
{
  if (changes.Changes
             .OfType<RoleEnvironmentConfigurationSettingChange>()
             .Any(change => change.ConfigurationSettingName == config))
  {
    if (!setter(RoleEnvironment.GetConfigurationSettingValue(config)))
    {
      RoleEnvironment.RequestRecycle();
    }
  }
};

We register for the Changing event. When this event is raised, you receive all the changes in a collection. You walk the collection to see whether the configuration setting you're interested in is part of it. In that case you need to re-invoke the ConfigurationSettingPublisher, and when that method returns false you need to recycle the role. Please note that this will only happen when running as a role (not as a local web site), so we can use the cloud version to retrieve the configuration… So what is the result of all this? No need to call special code in your Global.asax or worker role Start method. It just works!
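Putting the two pieces together: the Changing registration needs the captured config and setter, so it fits naturally inside the publisher delegate in the static constructor. This is a sketch of how the combined helper class might look (the structure is my own arrangement of the snippets above, not code from the original post):

```csharp
using System.Configuration;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;

namespace MessagesLib
{
  public static class MyStorageAccount
  {
    public static string DataConnection = "DataConnection";

    public static CloudStorageAccount Instance
    {
      get { return CloudStorageAccount.FromConfigurationSetting(DataConnection); }
    }

    static MyStorageAccount()
    {
      CloudStorageAccount.SetConfigurationSettingPublisher(
        (config, setter) =>
        {
          // Initial value: cloud configuration when running as a role,
          // appSettings when running as a plain local web site.
          setter(
            RoleEnvironment.IsAvailable ?
              RoleEnvironment.GetConfigurationSettingValue(config) :
              ConfigurationManager.AppSettings[config]);

          // Configuration updates only happen when running as a role,
          // so the change handler is registered only in that case.
          if (RoleEnvironment.IsAvailable)
          {
            RoleEnvironment.Changing += (_, changes) =>
            {
              if (changes.Changes
                         .OfType<RoleEnvironmentConfigurationSettingChange>()
                         .Any(c => c.ConfigurationSettingName == config))
              {
                // Re-publish the new value; recycle if it can't be applied.
                if (!setter(RoleEnvironment.GetConfigurationSettingValue(config)))
                {
                  RoleEnvironment.RequestRecycle();
                }
              }
            };
          }
        });
    }
  }
}
```

Because the delegate closes over config and setter, every setting that goes through the publisher automatically gets the change handling as well.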

Running multiple sites in one Windows Azure Web Role

Since the release of the Windows Azure SDK 1.3 it is possible to host multiple sites in one web role. In this blog post I will show you how to do this.

1. Creating the Azure project

Start by creating a new Azure Cloud project. Add a single WebRole project (call it MultiSitesWebRole) to it and hit OK. Now we just want to make sure we can tell the web sites apart, so open the default.aspx page and change the header, for example:

Code Snippet

<%@ Page Title="Home Page" Language="C#" MasterPageFile="~/Site.master" AutoEventWireup="true"
    CodeBehind="Default.aspx.cs" Inherits="WebApplication1._Default" %>

<asp:Content ID="HeaderContent" runat="server" ContentPlaceHolderID="HeadContent">
</asp:Content>
<asp:Content ID="BodyContent" runat="server" ContentPlaceHolderID="MainContent">
    <h2>
        This is the main site!
    </h2>
    <p>
        To learn more about ASP.NET visit <a href="http://www.asp.net" title="ASP.NET Website">www.asp.net</a>.
    </p>
    <p>
        You can also find <a href="http://go.microsoft.com/fwlink/?LinkID=152368&amp;clcid=0x409"
            title="MSDN ASP.NET Docs">documentation on ASP.NET at MSDN</a>.
    </p>
</asp:Content>

Make sure the cloud project is set as the startup project, and then run (F5) your solution. Your web browser should open and display the site. While this is still running, open IIS Manager and open the list of sites; you should see the site for this solution (the name will be different). The Windows Azure compute emulator actually uses IIS to run your web role by creating a site on your local machine. Stop your debugging session.

2. Adding the second site

Right-click your solution and add another ASP.NET web project, calling it "TheSecondWebSite".
Update default.aspx again to show that this is the second site:

Code Snippet

<%@ Page Title="Home Page" Language="C#" MasterPageFile="~/Site.master" AutoEventWireup="true"
    CodeBehind="Default.aspx.cs" Inherits="WebApplication2._Default" %>

<asp:Content ID="HeaderContent" runat="server" ContentPlaceHolderID="HeadContent">
</asp:Content>
<asp:Content ID="BodyContent" runat="server" ContentPlaceHolderID="MainContent">
    <h2>
        The second web site!
    </h2>
    <p>
        To learn more about ASP.NET visit <a href="http://www.asp.net" title="ASP.NET Website">www.asp.net</a>.
    </p>
    <p>
        You can also find <a href="http://go.microsoft.com/fwlink/?LinkID=152368&amp;clcid=0x409"
            title="MSDN ASP.NET Docs">documentation on ASP.NET at MSDN</a>.
    </p>
</asp:Content>

Now open the cloud service definition file (ServiceDefinition.csdef) and look at the <Sites> element and its children. Copy the <Site> element to create a second one. Modify the site name and add a physicalDirectory attribute set to the path of the second site. Inside the <Site> element, look for the <Binding> element and add a hostHeader attribute set to www.contoso.com. OK, I'm using www.contoso.com for testing purposes; you can use your own site URL if you want. We'll be changing the hosts file on your machine so you can test everything, but this won't work for other people.
If you really want to make this work you will need to make the necessary DNS entries to redirect your site URL to the cloud URL…

ServiceDefinition.csdef

Code Snippet

<Sites>
  <!-- First site -->
  <Site name="Check" physicalDirectory="C:\...\FullIIS_MultipleSites\WebApplication1">
    <Bindings>
      <Binding name="Endpoint1" endpointName="Endpoint1" />
    </Bindings>
  </Site>
  <!-- Second site -->
  <Site name="Encore" physicalDirectory="C:\...\FullIIS_MultipleSites\WebApplication2">
    <Bindings>
      <Binding name="Endpoint1" endpointName="Endpoint1" hostHeader="www.contoso.com" />
    </Bindings>
  </Site>
</Sites>

Before you run your solution we need to make sure that the host header www.contoso.com points to the local environment. You can do this by editing the hosts file, which can be found at <Windows>\System32\drivers\etc\hosts. You can edit this file using Notepad. Make sure you have the following line in it:

127.0.0.1 www.contoso.com

Run your solution. Your browser should show the first site. Note the port used by the first site (in the screenshot above this is port 81; the compute emulator takes the first available port above port 80). Now you can open the second site by browsing to www.contoso.com:81 (use your own port here). Before you stop debugging, open IIS Manager again; you should now see two sites (you may need to refresh the sites node).

3. Creating a nested virtual application

Add another web project, calling it TheThirdSite. Change default.aspx again to reflect that this is the third site.
Open the service definition again and add a VirtualApplication element inside the second site:

ServiceDefinition.csdef

Code Snippet

<Sites>
  <!-- First site -->
  <Site name="Check"
        physicalDirectory="C:\...\FullIIS_MultipleSites\WebApplication1">
    <Bindings>
      <Binding name="Endpoint1" endpointName="Endpoint1" />
    </Bindings>
  </Site>
  <!-- Second site -->
  <Site name="Encore"
        physicalDirectory="C:\...\FullIIS_MultipleSites\WebApplication2">
    <!-- Even a nested virtual application -->
    <VirtualApplication name="More"
                        physicalDirectory="C:\...\FullIIS_MultipleSites\MvcApplication1">
    </VirtualApplication>
    <Bindings>
      <Binding name="Endpoint1" endpointName="Endpoint1" hostHeader="www.contoso.com" />
    </Bindings>
  </Site>
</Sites>

Run again and browse to www.contoso.com:81/More. You should see the third site.

4. Deploying and testing in the cloud

Let's test your project in the cloud. Deploy your solution. Once deployment is done, click on the link to open the first site. Now open a command prompt and use the ping command to retrieve the site's IP address. Open the hosts file and modify the www.contoso.com entry to use this IP address. Browse to www.contoso.com (no need to specify a port number now). You should see the second site. If not, you might need to clear your DNS cache using the following commands:

ipconfig /flushdns
net stop dnscache
net start dnscache

Browse to the nested virtual application at www.contoso.com/More. That's it. You've just created your own web role in the cloud, hosting three different sites! To complete everything you should now go to your internet provider to make the necessary DNS entries to redirect your site URL to the cloud URL!

Remote debugging a Windows Azure Worker Role using Azure Connect, Remote desktop and the remote debugger, part 3

This is the third part of "Remote debugging a Windows Azure Worker Role using Azure Connect, Remote desktop and the remote debugger". In this part I will show you how to attach the remote debugger to your worker role and start debugging…

Step 3: Connect to the remote debugger

In the previous blog post we ended up copying the remote debugger to the worker role instance machine. Open the remote debugger folder (the one you just copied) and start the remote debugger by double-clicking the msvsmon.exe application. The remote debugger should start. Take note of the server name. Minimize the remote desktop session. Open Visual Studio (the one you used to publish the worker role project). Select Debug > Attach to Process… The Attach to Process window should open. Type in the server name you got when you started the remote debugger. Be patient while Visual Studio downloads the list of processes from the remote server…

WARNING: if you get authentication errors, it means you are not using the same username and password as on your local machine.

After a while you should get a list of all processes. Select the WaWorkerHost.exe process and click the Attach button.

Step 4: Debug!

Now you can add a breakpoint in your code and wait for the worker role to hit it. Again you will have to be patient… Happy debugging!

Remote debugging an Azure Worker role, using Azure Connect, Remote desktop and the remote debugger

So, you have some code running in a Windows Azure Worker Role on Azure (not on the Azure compute emulator) and you need to debug what is going on. In this blog post I will show you how you can use Azure Connect, remote desktop and the remote debugger to look inside your worker role.

WARNING: this process can be a strain on your patience, since the debugger will work very slowly! But it is the only way to look inside a running worker role using the debugger…

Step 1: Install your worker role on Azure, enabling remote desktop and Azure Connect

First we need to install the worker role in an Azure instance, not forgetting to enable remote desktop and Azure Connect. To enable Azure Connect, right-click the worker role and select Properties. The worker role configuration window will open. Select the Virtual Network tab. Check the "Activate Windows Azure Connect" checkbox and copy-paste your activation token into the textbox. To get the activation token, go to the Windows Azure Portal, select the Virtual Network tab and select your subscription (you may need to request access to Windows Azure Connect because it is still in CTP). Click on the Get Activation Token button, copy the activation token and paste it into the project's properties. If you have never done so before, click on the Install Local Endpoint button to get the local software installed on your computer.

To enable Remote Desktop, you first need to install a certificate on your hosted service, containing both the private and public key pair. This is easy to do using Visual Studio: right-click your Azure project and select Publish… The Deploy Windows Azure project dialog will open. Click on the Configure Remote Desktop connection link, which opens the Remote Desktop Configuration dialog. Select a certificate from the drop-down, or create a new certificate using the <Create…> entry (the last entry in the drop-down combobox).
To export the certificate (so you can install it using the Azure Portal), click on the View… button. Click on the Details tab, then on the Copy to File… button. The Certificate Export wizard will open. Click Next. Select the "Yes, export the private key" option and press Next. Keep the default .pfx option and press Next again. Enter a password and press Next. Select a file name, press Next, then Finish.

Now we're ready to import the certificate into the Azure portal. Go to your hosted services, select the service of your choice and then select the Certificates folder. Click on the Add Certificate button. Open the previously created .pfx file, enter your password and press Create. This should add the certificate to the hosted service. Go back to Visual Studio (the Remote Desktop Configuration dialog should still be waiting for you). Select the certificate, and enter your username and password.

WARNING: you want to use your real username and password; this way you can authenticate to the remote server from Visual Studio. Otherwise you will get authentication errors!

Enter an appropriate expiration date and click OK to start publishing. Wait for Visual Studio to complete… That was step 1. In the next post I will show you how to copy the remote debugger to your worker role instance…

Getting Started Developing on Azure: Creating the Hosted Service and Storage Account

This is the next post in the series after Getting Started developing on Azure with Visual Studio 2010 and Installing the Azure Management Certificates. Starting to develop on Azure with Visual Studio can be a lot to take in the first time. Getting Visual Studio ready, including installing the management certificates and so on, is not a simple task (the first time anyway). That is why I made this little walk-through series on starting to develop on Azure… In the first part you built and ran your first project on the Azure compute emulator, which is the local test version of the cloud. In the second part you got Visual Studio ready to publish your solution directly to the cloud by installing the management certificates, and in this part you will create the hosted service and storage account needed to actually deploy from Visual Studio.

1.3 Creating the Hosted Service

Go back to the Azure Management Portal (windows.azure.com) and click on the Hosted Services tab. To deploy we also need a hosted service, so click the New Hosted Service button to create one. This opens the Create a New Hosted Service dialog. Enter a name for your service and a URL prefix. The prefix should be world-wide unique, so try adding your company name or something unique to you. If the chosen name is already taken you will be notified. Now we need to choose the data center in which the service should be deployed. We can also do this using an affinity group. An affinity group is an easy way to keep everything together (services can communicate with one another more efficiently, and at lower cost, if they are running in the same data center), so choose affinity group and select Create a new affinity group… from the drop-down list. Enter a name and a data center; it's best to take one near your expected customers. North Europe is the data center in Amsterdam, which is closest to where I live, so I took that one. Feel free to take another… Click OK.
We'll deploy using Visual Studio, so before you click OK, select Do not deploy. Now click OK.

1.1.5 Creating the Storage Account

Finally, before we can deploy, we also need to create a storage account. Visual Studio will upload the package to storage and then instruct Azure to deploy it from storage. Go back to the Management Portal (windows.azure.com) and go to the Storage Accounts tab. Click on the New Storage Account button. This will open the Create a New Storage Account dialog. Enter a unique (lowercase only) URL and use the same affinity group as in the previous step. Hit OK.

1.1.6 Ready to deploy the web site

Now we are ready to deploy! First we need to remove the local development trace listener from the web.config, because it is not available in the cloud, but leave the diagnostic monitor listener:

Code Snippet

<listeners>
  <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
    name="AzureDiagnostics">
    <filter type="" />
  </add>
</listeners>

Go back to Visual Studio, right-click your Azure project, and select Publish… Now select your hosted service and storage account, and hit OK (you might need to cancel and re-open this window to refresh it). Deployment should start, and after a little while you should see the Windows Azure Activity Log. Wait until this is complete; it may take several minutes or longer depending on bandwidth… Click on the Website URL; the site should open.

1.1.7 Using Server Explorer

Open Server Explorer. With the Azure SDK tools installed, you can look here at Azure Compute and Azure Storage. Click on the Windows Azure Compute tree item, then select Add Deployment Environment… Select your management certificate to list all your hosted services. Open the tree item and select Staging or Production. Click OK.
From now on you can look at this environment from the Visual Studio Server Explorer. You can try the same for storage. Later we will look at how we can use this to debug applications with IntelliTrace. The end. My next post (next Wednesday) is going to be on remote debugging worker roles running in the cloud…

Installing the Azure Management Certificates

This is the next post after Getting Started developing on Azure with Visual Studio 2010. Starting to develop on Azure with Visual Studio can be a lot to take in the first time. Getting Visual Studio ready, including installing the management certificates and so on, is not a simple task (the first time anyway). That is why I made this little walk-through series on starting to develop on Azure… In the first part you built and ran your first project on the Azure compute emulator, which is the local test version of the cloud. In this part you'll get Visual Studio ready to publish your solution directly to the cloud by installing the management certificates, and in part 3 you will create the hosted service and storage account needed to actually deploy from Visual Studio.

1.2 Deploying your solution to the cloud

Now we're ready to deploy to the actual cloud environment. Start by logging on to the Azure Management Portal at http://windows.azure.com. Open the Hosted Services tab, and select Management Certificates. If you just started with Azure development you will not have any certificate here. To make Visual Studio integrate better with the management portal (actually with the management APIs) we need to upload a certificate here, so that your Visual Studio can publish projects to the cloud. This is easily done with Visual Studio.

1.1.3 Installing the Management Certificate

Go back to Visual Studio with the cloud project open. Right-click the cloud project and select Publish… The Deploy Windows Azure project dialog should open. With this dialog you can publish your project. The first option only creates the package, which you then have to deploy using the Management Portal; the second option does the deployment for you. In the Credentials drop-down you need to select a management certificate, or create a new one, so select Add… The Windows Azure Project Management Authentication dialog opens. Select <Create…> from the first drop-down.
Enter a friendly name for your certificate; I call mine AzureManagment. Now click the View button. This allows us to export the certificate's public key using the Details tab. Click on the Copy to File button. This opens the Certificate Export wizard. Click Next. Choose "Do not export the private key". Click Next twice, then choose a filename for the certificate. Hit Next, then Finish. Your certificate should now be exported. Go back to the management portal (windows.azure.com), to the Management Certificates tab. Click on the Add Certificate button and select your previously exported file. Click OK and wait for the import to complete. Your certificate should be added, and now you need to copy the subscription ID back to Visual Studio. It is right there in the properties window of the certificate, so copy it and go back to Visual Studio, where the Windows Azure Project Management Authentication dialog is waiting. Paste the subscription ID and give it a name. Click OK. In the next blog post we will add a hosted service and storage account and deploy the solution…

Getting started developing on Azure with Visual Studio 2010

Starting to develop on Azure with Visual Studio can be a lot to take in the first time. Getting Visual Studio ready, including installing the management certificates and so on, is not a simple task (the first time anyway). That is why I made this little walk-through series on starting to develop on Azure… In this part you'll build and run your first project on the Azure compute emulator, which is the local test version of the cloud. In part 2 you'll get Visual Studio ready to publish your solution directly to the cloud by installing the management certificates, and in part 3 you will create the hosted service and storage account needed to actually deploy from Visual Studio.

1.1 Azure Lab – Getting started developing

In this walk-through you will learn how to develop on Azure with Visual Studio 2010. To be able to do this you need to have the latest Azure SDK with the Visual Studio tools installed (SDK 1.4 at the time of writing), and you should also have a valid Azure account.

1.1.1 Creating the Azure project

Start Visual Studio 2010 and create a new cloud project, calling it MyFirstCloudProject. Click OK. The New Windows Azure Project dialog should open. Click on the ASP.NET Web Role and click the > button. Then rename the project by clicking the rename button, calling it MyFirstWebRole. Click OK.
Add a button to the web form and implement its click event as follows:

Code Snippet

System.Diagnostics.Trace.WriteLine("Hello from Azure!");

To enable tracing to the compute emulator, add the following listener configuration to your web.config (the DevFabricListener entry is the one you need to add):

Code Snippet

<listeners>
  <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
    name="AzureDiagnostics">
    <filter type="" />
  </add>
  <add type="Microsoft.ServiceHosting.Tools.DevelopmentFabric.Runtime.DevelopmentFabricTraceListener, Microsoft.ServiceHosting.Tools.DevelopmentFabric.Runtime, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
    name="DevFabricListener">
    <filter type="" />
  </add>
</listeners>

Open the Solution Explorer and make sure that the Azure project is the startup project. Run your solution by pressing F5. Note that Visual Studio will package your code and upload it to the compute emulator. If it is not already running, the compute emulator (the little Azure icon in the system tray) will also be started. Right-click the Azure icon in the system tray and choose "Show Compute Emulator UI". After a while your browser will display the web site and the emulator will display log output. This window shows you output as your web role gets started, and so on. You can add some more output by clicking the button; this should write something to the emulator. Try placing a breakpoint on the button's event handler. Click the button. Visual Studio should stop on the breakpoint. If your code were a little more complicated, this would allow you to debug it. In the next blog post we will look at installing the management certificates to run this same code in the cloud, deploying it with Visual Studio 2010.

Creating and Using Custom Performance Counters in Windows Azure

Building software, especially software running on servers, requires some way to look "inside" the running application. Using the debugger is one way, but you cannot use a debugger on production applications. A better way is to use performance counters. These give you a way to see things like how hard the CPU is working, but also how many orders have been processed by your system. The first kind of counter is provided by the system; the latter you can build yourself. Before Azure SDK 1.3 you couldn't create your own performance counters, because your code doesn't get write access to the region of the registry where custom performance counters are registered. But with elevated startup tasks this is easy. In this blog post I will show you how to create a startup task that creates custom performance counters, and how to use them in your role.

Commence by creating a new Visual Studio cloud project. Add a single worker role; we'll use this role to illustrate using a performance counter. Add another project, this time a console project (call it InstallPerfCounters); we'll use this console application as the startup task. Implement the InstallPerfCounters console project as follows:

Code Snippet

class Program
{
  static void Main(string[] args)
  {
    const string categoryName = "U2U";
    const string categoryHelp = "U2U demo counters";

    if (!PerformanceCounterCategory.Exists(categoryName))
    {
      var counters = new System.Diagnostics.CounterCreationDataCollection();
      counters.Add(new CounterCreationData
      {
        CounterName = "# secs/running",
        CounterHelp = "How long has this been running",
        CounterType = PerformanceCounterType.NumberOfItems32
      });

      var category = PerformanceCounterCategory.Create(categoryName,
        categoryHelp, PerformanceCounterCategoryType.MultiInstance, counters);
    }
  }
}

This uses the same kind of code you would use anywhere else to create new performance counters.
Now we need to install this as a startup task. Add a folder called startup to the worker role project. We need to add two files here: the executable from the console project we just made, and a command file. Before copying the executable, let’s first make sure we’re always using the release build. Open the Configuration Manager and select Release as the active configuration. Build your project, ensuring everything compiles. Now right-click the startup folder in the worker role project and select Add Existing Item… Browse to the release folder of the console project, select the executable and choose Add As Link from the drop-down. This adds the executable to the startup folder. Select it, and set Copy to Output Directory to Copy Always in the Properties window.

Now we are ready to add the command file. Don’t use Visual Studio to create it, because Visual Studio will add a Byte Order Mark, which is not supported by Azure. The easiest way is to right-click the startup folder and select “Open Folder in Windows Explorer”. Then right-click inside the folder and add a new text document. Rename it to installcmd.cmd. Go back to Visual Studio and in Solution Explorer select “Show All Files”. The installcmd.cmd file should appear; right-click it and select “Include in Project”. Edit it to the following contents (%~dp0 expands to the directory the command file lives in, so the path works wherever Azure copies the role):

Code Snippet

%~dp0InstallPerfCounters.exe /q /log %~dp0pc_install.htm
exit /b 0

Now open the ServiceDefinition.csdef file from your cloud project and add a startup task:

Code Snippet

<Startup>
  <Task commandLine="startup\installcmd.cmd" executionContext="elevated" taskType="simple" />
</Startup>

This takes care of installing the performance counter. Now let’s use it in our worker role. First we need to create a performance counter instance, and then update it. In this simple example we’ll make the counter increment once each second, so implement the worker role’s Run method as follows:

Code Snippet

public override void Run()
{
  // This is a sample worker implementation. Replace with your logic.
  Trace.TraceInformation("UsingPerfCounters entry point called");

  const string categoryName = "U2U";

  PerformanceCounter secsRunning = new PerformanceCounter()
  {
    CategoryName = categoryName,
    CounterName = "# secs/running",
    MachineName = "." /* current machine */,
    InstanceName = Environment.MachineName,
    ReadOnly = false
  };
  var counterExists = PerformanceCounterCategory.Exists(categoryName);

  while (true)
  {
    Thread.Sleep(TimeSpan.FromSeconds(1));
    if (counterExists)
    {
      secsRunning.Increment();
    }
    Trace.WriteLine("Working", "Information");
  }
}

Publish this solution to Azure, not forgetting to turn on Remote Desktop. Also note that I turn on IntelliTrace, which is great for debugging those nasty deployment problems… When publishing completes you can remote desktop to the instance and use PerfMon to look at your custom performance counter. Or you can use Azure Diagnostics…
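If you go the Azure Diagnostics route instead of remote desktop, you can ask the diagnostics monitor to sample the custom counter and transfer the results to storage. A minimal sketch for the worker role’s OnStart, assuming the standard Diagnostics plugin connection string and the Azure SDK 1.3 diagnostics API; the counter specifier with the (*) instance wildcard is an assumption for this example:

```csharp
using System;
using Microsoft.WindowsAzure.Diagnostics;

// In the worker role's OnStart: sample the custom counter every 5 seconds
// and ship the samples to storage once a minute.
var config = DiagnosticMonitor.GetDefaultInitialConfiguration();
config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration
{
  // Counter created by the startup task; (*) is assumed to match all instances.
  CounterSpecifier = @"\U2U(*)\# secs/running",
  SampleRate = TimeSpan.FromSeconds(5)
});
config.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
DiagnosticMonitor.Start(
  "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);
```

The samples end up in the WADPerformanceCountersTable table of the configured storage account, where you can query them with any table storage viewer.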

Silverlight and the Windows Azure AppFabric Service Bus

This blog post will show you how to allow a Silverlight application to call a service over the Windows Azure AppFabric Service Bus. The problem you need to solve is that Silverlight will look for a “clientaccesspolicy.xml” file at the root URI of the service. When I tried it myself I couldn’t find any “how to” on this topic, so I decided to turn it into a blog post. If anyone else has blogged this already: sorry, I am such a bad internet searcher.

So, you’ve just built a nice Silverlight application that uses some WCF service you’re hosting locally. You’ve done all the steps to make it work on your machine, including the “clientaccesspolicy.xml” to enable cross-domain communication. The only thing is that you want to keep hosting the service locally and/or move it to another machine without updating the Silverlight client. You’ve heard that the Windows Azure Service Bus allows you to do this more easily, so you decide to use it. This is your current service configuration (notice the localhost address!):

Code Snippet

<service name="SilverlightAndAppFabric.TheService" >
  <endpoint name="HELLO"
            address="http://localhost:1234/rest"
            behaviorConfiguration="REST"
            binding="webHttpBinding"
            bindingConfiguration="default"
            contract="SilverlightAndAppFabric.IHello" />
</service>

What you now need to do is move it to the AppFabric Service Bus. This is easy. Of course you need a Windows Azure subscription and you need to set up an AppFabric Service Bus namespace first; there is plenty of documentation around on that.
Then you change the address, binding and behavior like this. You need an endpoint behavior, because your service needs to authenticate to the service bus (so they can send you the bill):

Code Snippet

<endpointBehaviors>
  <behavior name="REST">
    <webHttp />
    <transportClientEndpointBehavior>
      <clientCredentials>
        <sharedSecret
          issuerName="owner"
          issuerSecret="---your secret key here please---" />
      </clientCredentials>
    </transportClientEndpointBehavior>
  </behavior>
</endpointBehaviors>

You (might) need a binding configuration to allow clients to access your service anonymously:

Code Snippet

<webHttpRelayBinding>
  <binding name="default" >
    <security relayClientAuthenticationType="None">
    </security>
  </binding>
</webHttpRelayBinding>

And of course you need to change the endpoint to use the webHttpRelayBinding:

Code Snippet

<endpoint name="HELLO"
          address="https://u2utraining.servicebus.windows.net/rest"
          behaviorConfiguration="REST"
          binding="webHttpRelayBinding"
          bindingConfiguration="default"
          contract="SilverlightAndAppFabric.IHello" />

This should do the trick: when you try the REST service using Internet Explorer you get back the intended result. Now you update the address in your Silverlight application to use the service bus endpoint. This is the old call:

Code Snippet

wc.DownloadStringAsync(new Uri("http://localhost:1234/rest/hello"));

And you change it to:

Code Snippet

wc.DownloadStringAsync(new Uri("https://u2utraining.servicebus.windows.net/rest/hello"));

Please note the switch to https and the service bus address. You run your Silverlight client and it fails with some strange security error! The problem is that Silverlight tries to download the clientaccesspolicy.xml file from the root of your new address. Since that root is now the service bus, this will not work.
To solve it you simply add another REST endpoint that returns the clientaccesspolicy.xml from the root URI. Start with the service contract:

Code Snippet

[ServiceContract]
public interface IClientAccessPolicy
{
  [OperationContract]
  [WebGet(UriTemplate = "clientaccesspolicy.xml")]
  Message GetPolicyFile();
}

Implement it (note that the content type is set on the outgoing response, since we are on the service side):

Code Snippet

public Message GetPolicyFile()
{
  WebOperationContext.Current.OutgoingResponse.ContentType = "text/xml";

  using (FileStream stream = File.Open("clientaccesspolicy.xml", FileMode.Open))
  {
    using (XmlReader xmlReader = XmlReader.Create(stream))
    {
      Message m = Message.CreateMessage(MessageVersion.None, "", xmlReader);
      using (MessageBuffer buffer = m.CreateBufferedCopy(1000))
      {
        return buffer.CreateMessage();
      }
    }
  }
}

And make sure it returns the right policy. This is what gave me a lot of headaches, so here it is:

Code Snippet

<?xml version="1.0" encoding="utf-8"?>
<access-policy>
  <cross-domain-access>
    <policy>
      <allow-from http-request-headers="*">
        <domain uri="http://*"/>
        <domain uri="https://*"/>
      </allow-from>
      <grant-to>
        <resource path="/" include-subpaths="true"/>
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>

Pay special attention to the allow-from element: by default this allows SOAP calls, not REST calls. For the details read the documentation; you might want to tighten it anyway. Now add a similar REST endpoint, making sure the clientaccesspolicy.xml is served at the root level:

Code Snippet

<endpoint name="CLIENTACCESSPOLICY"
          address="https://u2utraining.servicebus.windows.net"
          behaviorConfiguration="REST"
          binding="webHttpRelayBinding"
          bindingConfiguration="default"
          contract="SilverlightAndAppFabric.IClientAccessPolicy" />

Done! A working example (you will have to change the client credentials to your own) can be downloaded from the U2U site here.