Installing requirements using Azure startup tasks

Windows Azure deploys your web or worker role in the cloud, on a machine with Windows Server 2008 and .NET 4 pre-installed. But what if you have an additional requirement? What if you need to install a performance counter, or another piece of software such as a media encoder? Then you can use a startup task to get the job done. In this blog post you will create a simple web role using ASP.NET MVC 3, then add a startup task to ensure MVC 3 is also installed on the Azure instance. For this walkthrough you’ll need Visual Studio 2010 and ASP.NET MVC 3. You’ll also need the standalone MVC 3 installer, which you can find here.

Step 1: Create the Azure solution

Start by creating a new Cloud project and call it UsingStartupTasks. Click Ok. Don’t add any role just yet, so click Ok in the next screen. MVC 3 is not available from the “New Windows Azure project” dialog, so we’ll need another way to get an ASP.NET MVC project into Azure. Now add a new ASP.NET MVC 3 project, calling it HelloMVC3. Select the Internet Application template, leave the rest at its defaults, then press Ok. Right-click the Roles folder beneath your cloud project and select Add->Web Role Project in Solution… Select the HelloMVC3 project in the next screen and hit Ok.

Adding the startup task

Add a new folder StartupTasks to your MVC project and add the MVC installer AspNetMVC3Setup.exe to it. Open notepad.exe (don’t add the following file using Visual Studio, because Visual Studio will add a Byte Order Mark and the Azure runtime doesn’t like that) and create a new batch file called installmvc.cmd in the StartupTasks folder. To add it to the Visual Studio project, first click the Show All Files button in the solution explorer, then right-click the installmvc.cmd file and select Include In Project. Do the same for the AspNetMVC3Setup.exe installer.
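The Byte Order Mark problem is easy to see for yourself. The short script below (Python is used only to make the byte difference visible; it is not part of the walkthrough) writes the same one-line batch file twice and dumps the first bytes of each. The BOM version carries three extra bytes in front of the first command, which is why the Azure runtime fails to run it:

```python
# Write the same one-line script twice: once the way a "UTF-8 with
# signature" save would produce it, once as plain ASCII.
with open("with_bom.cmd", "w", encoding="utf-8-sig") as f:
    f.write("exit /b 0\r\n")
with open("without_bom.cmd", "w", encoding="ascii") as f:
    f.write("exit /b 0\r\n")

with open("with_bom.cmd", "rb") as f:
    bom_version = f.read()
with open("without_bom.cmd", "rb") as f:
    plain_version = f.read()

# The BOM version starts with the bytes EF BB BF before "exit /b 0",
# so the first command no longer starts at the first byte of the file.
print(bom_version[:3])    # b'\xef\xbb\xbf'
print(plain_version[:3])  # b'exi'
```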
We’ll use this batch file to execute the installer. Enter the following in installmvc.cmd:

%~dp0AspNetMVC3Setup.exe /q /log %~dp0mvc3_install.htm
exit /b 0

The %~dp0 expands to the directory containing the startup tasks (the Azure server’s local copy of the StartupTasks folder). So the first line runs the standalone MVC 3 installer as a silent (quiet) install, writing any install problems to a log file called mvc3_install.htm, and the second line returns a success exit code. Make sure both files have a Build Action of “None” and Copy to Output Directory set to “Copy Always”.

Editing the service definition file

Finally you need to open the ServiceDefinition.csdef file and add the task to it:

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="UsingStartupTasks" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="HelloMVC3">
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="Endpoint1" endpointName="Endpoint1" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="Endpoint1" protocol="http" port="80" />
    </Endpoints>
    <Imports>
      …
    </Imports>
    <Startup>
      <Task commandLine="StartupTasks\installmvc.cmd" executionContext="elevated" taskType="simple" />
    </Startup>
  </WebRole>
</ServiceDefinition>

In the <Startup> element you can add any number of Task elements. Each represents a command line that will be executed on the instance before your Azure role is installed. Each task takes a couple of options. First you can specify the executionContext: elevated or limited. Elevated gives you administrator-like privileges and is ideal for installers (which normally only work with admin privileges); limited gives you “normal user” privileges. You can also choose the taskType, which controls how the task is executed: simple, background or foreground.
Simple means that the installer will wait for this task to complete before continuing with the next task (or the actual installation of your Azure role). Background and foreground mean the same thing as in threading: a background task will not block the installer from continuing with the next task, so they run in parallel. To get MVC 3 installed we need to run elevated, and we don’t want installation to continue before MVC 3 has been installed, so we choose simple as the task type.

Deploying to Azure

Right-click on the Cloud project and select Publish… The Deploy Windows Azure project dialog opens. Select your credentials, environment and storage account (you may need to do some setup for this to work). Optionally you can configure remote desktop connections in case something goes wrong; this will make it easier to see whether MVC 3 was indeed installed. Click Ok and wait… Deployment in Visual Studio should start. Wait some more until it completes (because the startup tasks are executing, this will take a long time). Click on the Url. You should now see the MVC 3 screen! In my next blog post I will show you how to turn this startup task into an Azure startup plugin, which will make it easier to re-use.
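As an aside, here is what the <Startup> element could look like with a second, hypothetical background task next to the simple one we used (the startagent.cmd script is invented for the example; only installmvc.cmd exists in our project):

```xml
<Startup>
  <!-- simple: the role will not start until this task has finished. -->
  <Task commandLine="StartupTasks\installmvc.cmd" executionContext="elevated" taskType="simple" />
  <!-- background: runs in parallel with role startup; use this for work
       the role does not depend on, such as starting a monitoring agent. -->
  <Task commandLine="StartupTasks\startagent.cmd" executionContext="limited" taskType="background" />
</Startup>
```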

Storing messages in table storage

In my previous post I looked at getting started with table storage; in this one we will create a table for our entities and store them. As you’ll see, it is quite easy! To store an entity in table storage you start by creating a TableServiceEntity-derived class (recap from the previous post):

public class MessageEntity : TableServiceEntity
{
  public MessageEntity() { }

  public MessageEntity(string partitionKey, string rowKey, string message)
    : base(partitionKey, rowKey)
  {
    Message = message;
  }

  public string Message { get; set; }
}

You also need a table context class, this time deriving from TableServiceContext:

public class MessageContext : TableServiceContext
{
  public const string MessageTable = "Messages";

  public MessageContext(string baseAddress, StorageCredentials credentials)
    : base(baseAddress, credentials)
  {
  }
}

TableServiceContext requires a base address and credentials, and since our class derives from it we need a constructor taking the same arguments. I also added a const string for the table name. If the table doesn’t exist yet you should create it. This is easy: just add the code to create the table in MessageContext’s static constructor:

static MessageContext()
{
  var tableClient = MyStorageAccount.Instance.CreateCloudTableClient();
  tableClient.CreateTableIfNotExist(MessageTable);
}

A static constructor is automatically called the first time you use the type. Note that I use the MyStorageAccount class, which uses the same static constructor trick to initialize the storage account:

public static class MyStorageAccount
{
  public static string DataConnection = "DataConnection";

  public static CloudStorageAccount Instance
  {
    get { return CloudStorageAccount.FromConfigurationSetting(DataConnection); }
  }

  static MyStorageAccount()
  {
    CloudStorageAccount.SetConfigurationSettingPublisher(
      (config, setter) =>
      {
        setter(
          RoleEnvironment.IsAvailable ?
            RoleEnvironment.GetConfigurationSettingValue(config) :
            ConfigurationManager.AppSettings[config]
        );
        RoleEnvironment.Changing += (_, changes) =>
        {
          if (changes.Changes
                     .OfType<RoleEnvironmentConfigurationSettingChange>()
                     .Any(change => change.ConfigurationSettingName == config))
          {
            if (!setter(RoleEnvironment.GetConfigurationSettingValue(config)))
            {
              RoleEnvironment.RequestRecycle();
            }
          }
        };
      });
  }
}

Now we are ready to add the code to create a message and add it to our table. Add the following code to MessageContext:

public static MessageEntity CreateMessage(string message)
{
  return new MessageEntity(MessageTable, Guid.NewGuid().ToString(), message);
}

public void AddMessage(MessageEntity msg)
{
  this.AddObject(MessageTable, msg);
  this.SaveChanges();
}

The CreateMessage method creates a new MessageEntity instance, with the same partition key for every message (I don’t expect to store a lot of messages), a unique Guid as the row key, and of course the message itself. The AddMessage method adds this entity to the table and then calls SaveChanges to send the new row to the table. This mechanism uses the same concepts as WCF Data Services.

In the previous post we created a web site with a textbox and a button. Implement the button’s click event as follows:

protected void postButton_Click(object sender, EventArgs e)
{
  string message = messageText.Text;
  var msg = MessageContext.CreateMessage(message);
  context.AddMessage(msg);
}

This will allow you to add messages to storage. Before you can run this sample, you also need to set up the connection. Double-click the CloudMessages project beneath the Roles folder. This opens the project’s configuration window. Select the Settings tab and add a “DataConnection” setting, select “Connection String” as the type and then select your preferred storage account. In the beginning it is best to use development storage, and that is what I did here. After running the web site you are of course wondering whether your messages were actually added.
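For reference, after adding the setting through the designer, your ServiceConfiguration.cscfg should contain an entry along these lines (the instance count is just an example; UseDevelopmentStorage=true is the connection string for development storage):

```xml
<Role name="CloudMessages">
  <Instances count="1" />
  <ConfigurationSettings>
    <Setting name="DataConnection" value="UseDevelopmentStorage=true" />
  </ConfigurationSettings>
</Role>
```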
So let’s add some code and UI to display the messages in the table. Start by adding the following property to MessageContext:

public IQueryable<MessageEntity> Messages
{
  get { return CreateQuery<MessageEntity>(MessageTable); }
}

This property returns an IQueryable<MessageEntity>, which is then used by LINQ for writing queries. The actual query is performed in our web page class, but first we need to add some UI to display the messages. Add a repeater control beneath the TextBox and Button:

<asp:Content ID="BodyContent" runat="server" ContentPlaceHolderID="MainContent">
  <p>
    <asp:TextBox ID="messageText" runat="server" Width="396px"></asp:TextBox>
    <asp:Button ID="postButton" runat="server" OnClick="postButton_Click" Text="Post message" />
  </p>
  <p>
    <asp:Repeater ID="messageList" runat="server">
      <ItemTemplate>
        <p>
          <%# ((MessagesLib.MessageEntity) Container.DataItem).Message %>
        </p>
      </ItemTemplate>
    </asp:Repeater>
  </p>
</asp:Content>

Now that we can display the messages, let’s add a LoadMessages method below the click event handler of the page:

private void LoadMessages()
{
  var query = from msg in context.Messages
              select msg;
  messageList.DataSource = query.ToList()
                                .OrderBy(m => m.Timestamp)
                                .Take(10);
  messageList.DataBind();
}

Call this method in the Load event of the page:

protected void Page_Load(object sender, EventArgs e)
{
  if (!IsPostBack)
  {
    LoadMessages();
  }
}

And again in the button’s click event:

protected void postButton_Click(object sender, EventArgs e)
{
  string message = messageText.Text;
  var msg = MessageContext.CreateMessage(message);
  context.AddMessage(msg);
  LoadMessages();
}

Run. Add some messages, and see them listed (only the first 10 messages will be displayed; change the query as you like).

Introducing Windows Azure Table Storage

Windows Azure storage gives you several persistent and durable storage options. In this blog post I want to look at Table storage (which I prefer to call Entity storage, because you can store any mix of entities in these tables; you can store products AND customers in the same table). For this walkthrough you’ll need the Azure SDK and a machine set up for development.

1. Getting ready

Start Visual Studio 2010 and create a new Azure project called MessageServiceWithTables. In the New Windows Azure Project dialog select the ASP.NET Web Role and press the > button, then rename the project to CloudMessages. Replace the content of default.aspx with the following:

<%@ Page Title="Home Page" Language="C#" MasterPageFile="~/Site.master" AutoEventWireup="true"
    CodeBehind="Default.aspx.cs" Inherits="CloudMessages._Default" %>

<asp:Content ID="HeaderContent" runat="server" ContentPlaceHolderID="HeadContent">
</asp:Content>
<asp:Content ID="BodyContent" runat="server" ContentPlaceHolderID="MainContent">
  <p>
    <asp:TextBox ID="messageText" runat="server" Width="396px"></asp:TextBox>
    <asp:Button ID="postButton" runat="server" OnClick="postButton_Click" Text="Post message" />
  </p>
</asp:Content>

2. Creating the table store entity classes

Add a new Class Library project to your solution and call it MessagesLib. Delete class1.cs and add a new class called MessageEntity.

2.1 Creating the entity class

We want to derive this class from TableServiceEntity, but first we need to add a couple of references. Select Add Reference… on the library project, browse to Program Files\Windows Azure SDK\v1.4\ref and select the following libraries (or simply select them all). You also need to add a reference to System.Data.Services.Client (I’m using the Power Tools, so my Add Reference dialog looks different; excuse me, better!). Now you’re ready to add the MessageEntity class deriving from the TableServiceEntity base class.
public class MessageEntity : TableServiceEntity
{
  public MessageEntity() { }

  public MessageEntity(string partitionKey, string rowKey, string message)
    : base(partitionKey, rowKey)
  {
    Message = message;
  }

  public string Message { get; set; }
}

This base class has three properties used by table storage: the partition key, the row key and the timestamp. The partition key is used as follows: all entities sharing the same partition key share the same storage device; they are kept together, which makes querying these objects faster. On the other hand, entities with different partition keys can be stored on different machines, allowing queries to be distributed over these machines when there are many entities. So choosing the partition key is a tricky thing, and there are no automated tools to help you here. Some people use buckets (say from 0 to 9) and evenly distribute their entities over all the buckets. The row key makes the entity unique, and the timestamp is used for concurrency checking (optimistic concurrency). So, what do we need to store? Since we just want to store messages, we add a single Message property and constructors for easy instantiation. In the next blog post we’ll look at creating the table in table storage and inserting new data…
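The bucket idea mentioned above can be sketched in a few lines. Python is used here for a quick, runnable illustration; the bucket count of 10 and the choice of MD5 are arbitrary assumptions, not anything prescribed by table storage:

```python
import hashlib

def partition_key_for(row_key: str, buckets: int = 10) -> str:
    """Map an entity to one of a fixed set of partitions ("buckets").

    Hashing keeps the mapping stable (the same entity always lands in
    the same bucket) while spreading entities evenly over all buckets,
    so queries can be distributed over the machines that host them.
    """
    digest = hashlib.md5(row_key.encode("utf-8")).digest()
    return str(digest[0] % buckets)

keys = [partition_key_for("message-%d" % i) for i in range(1000)]
print(sorted(set(keys)))  # with 1000 entities, all ten buckets are used
print(partition_key_for("message-1") == partition_key_for("message-1"))  # True
```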

Debugging those nasty Windows Azure startup-code problems with IntelliTrace

1. Introducing IntelliTrace

How do engineers figure out what caused a plane crash? One of the things they use is the black-box recording, which recorded all the major data from the plane prior to the crash. This recording allows them to step back in time and analyze step by step what happened. Microsoft Visual Studio 2010 Ultimate also has a black box for your code, called IntelliTrace. While your code is running, IntelliTrace writes a log file (called an iTrace file), which you can analyze using Visual Studio Ultimate. Windows Azure also allows you to enable IntelliTrace on the server running your code, and this is ideal for figuring out why your code is crashing on the server, especially early startup code (because you cannot attach the debugger). In the next part you’ll walk through this process. But first you need to download the AzureAndIntelliTrace solution.

2. Deploying the solution

We’ll start by deploying the solution, so open the AzureAndIntelliTrace solution with Visual Studio. Right-click the Azure project and choose Publish… The Deploy Windows Azure project dialog should open. Make sure you check the “Enable IntelliTrace for .NET 4 roles” checkbox. Let’s have a look at the settings, so click the “Settings…” link and open the Modules tab. You can leave everything at its default settings, but you could remove the StorageClient library if you wanted to use IntelliTrace to track down a storage problem. Wait for the deployment until it says “Busy…”. If something goes wrong during the startup phase of your role instance with IntelliTrace enabled, Azure will keep the role busy so you can download the iTrace file, so that is the state you are waiting for. Then it is time to download the iTrace file. You can do that from the Server Explorer window: open the Windows Azure Compute tree item until you reach the instance (note the Busy state!). The instance name will also mention whether or not it is IntelliTrace enabled.
Now you can right-click the instance to download the iTrace file. Wait for the download to complete; the iTrace file should open automatically in Visual Studio. Scroll down until you reach the Exception Data section. You should see that you got a FileNotFoundException, caused because the runtime couldn’t find the Dependencies assembly.

3. Fixing the dependency problem

This kind of problem is easily solved, but first we need to stop this deployment. Go back to the Windows Azure Activity Log, right-click the deployment and choose “Cancel and remove”. The problem is that we have a reference to the Dependencies assembly, but when deployed to Azure it is not copied onto the instance. Go back to the solution and open the DebugThis project. Open the References folder, select the Dependencies assembly and, in the properties window, set the “Copy Local” property to true. Redeploy. Now we will have another problem, so wait for the instance to go Busy again… Download the iTrace file again. Scroll down to the exceptions section, select the FormatException and click the Start Debugging button. IntelliTrace will put you on the offending line. It is easy to see that the string.Format is missing an argument… You can start the debugger by clicking the “Set Debugger Context Here” button in the gutter, and then step back using IntelliTrace… As you can see, IntelliTrace is great for figuring out this kind of problem, especially if your code works in the compute emulator but doesn’t on the real Azure server instance…
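The bug IntelliTrace pinpointed, a format string with more placeholders than arguments, is easy to reproduce. Here is a Python rendition purely for illustration (the template string is made up; in .NET the same mistake makes string.Format throw a FormatException at runtime):

```python
template = "Role {0} failed at step {1}"

# Correct call: both placeholders get an argument.
print(template.format("DebugThis", 3))  # Role DebugThis failed at step 3

# Buggy call: the second argument is forgotten, so formatting raises an
# exception (IndexError in Python, FormatException in .NET) instead of
# returning text.
try:
    template.format("DebugThis")
except IndexError as error:
    print("format bug:", error)
```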

Building a Storage Account helper class (and forget about it)

When you use storage with the managed APIs, you always need to use a storage account and make sure you set the whole thing up correctly. The way to do this is slightly different when building a web role versus a worker role, so I decided to tackle this problem by building a simple class that takes care of everything and works the same in a web or worker role. All you need to do is copy/reference this class in each of your projects.

1. The first problem – setting up the ConfigurationSettingPublisher

The storage account uses a connection string. This connection string can come from the normal web.config (when you use storage in a local web site) or from the Azure project’s ServiceConfiguration.cscfg. To allow all connections to be available in both environments, the storage account uses a ConfigurationSettingPublisher, which is a delegate that retrieves the connection:

CloudStorageAccount.SetConfigurationSettingPublisher(
  (config, setter) =>
  {
    setter(
      RoleEnvironment.IsAvailable ?
        RoleEnvironment.GetConfigurationSettingValue(config)
        :
        ConfigurationManager.AppSettings[config]
    );
  });

The problem is where to put this code. In a web role you have to put it in startup code, so it goes in Global.asax. In a worker role there is no such thing, so you put it in the worker role’s Start method. You might say: let’s also put this code in the Start method of the web role. But that Start method runs in another process, so you end up without a properly initialized web role… So here is my solution: create a new class MyStorageAccount which uses a static constructor (also known as the type constructor) to call this code. A static constructor is automatically called by the runtime when you first use the class.
This way you always get correct initialization, and only when you need it:

using System.Configuration;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;

namespace MessagesLib
{
  public static class MyStorageAccount
  {
    public static string DataConnection = "DataConnection";

    public static CloudStorageAccount Instance
    {
      get
      {
        return CloudStorageAccount.FromConfigurationSetting(DataConnection);
      }
    }

    static MyStorageAccount()
    {
      CloudStorageAccount.SetConfigurationSettingPublisher(
        (config, setter) =>
        {
          setter(
            RoleEnvironment.IsAvailable ?
              RoleEnvironment.GetConfigurationSettingValue(config)
              :
              ConfigurationManager.AppSettings[config]
          );
        });
    }
  }
}

This class uses the static constructor to set up the ConfigurationSettingPublisher, and also has an Instance property. This allows you to retrieve the storage account without always having to provide the connection configuration name. Most applications will only use a single connection, so this should work fine. So instead of writing this all the time:

var account = CloudStorageAccount.FromConfigurationSetting("DataConnection");
var tableClient = account.CreateCloudTableClient();

You can now write it like this:

var account = MyStorageAccount.Instance;
var tableClient = account.CreateCloudTableClient();

Or even shorter:

var tableClient = MyStorageAccount.Instance.CreateCloudTableClient();

Much easier!

2. Handling configuration changes

The next problem is when the configuration of the role is updated. In that case you need to check whether you can handle the change; if not, you should restart the role.
Typical code for this looks like this:

RoleEnvironment.Changing += (_, changes) =>
{
  if (changes.Changes
             .OfType<RoleEnvironmentConfigurationSettingChange>()
             .Any(change => change.ConfigurationSettingName == config))
  {
    if (!setter(RoleEnvironment.GetConfigurationSettingValue(config)))
    {
      RoleEnvironment.RequestRecycle();
    }
  }
};

We register for the Changing event. When this event gets raised, you receive all the changes in a collection. You walk the collection to see if the configuration setting you’re interested in is part of it. In that case you need to re-invoke the ConfigurationSettingPublisher’s setter, and when that call returns false you need to recycle the role. Please note that this will only happen when running as a role (not as a local web site), so we can use the cloud version to retrieve the configuration… So what is the result of all this? No need to call special code in your Global.asax or worker role Start method. It just works!

Running multiple sites in one Windows Azure Web Role

Since the release of the Windows Azure SDK 1.3 it is possible to host multiple sites in one web role. In this blog post I will show you how to do this.

1. Creating the Azure project

Start by creating a new Azure Cloud project. Add a single WebRole project (call it MultiSitesWebRole) to it and hit Ok. Now we just want to make sure we can tell the web sites apart, so open the default.aspx page and change the header, for example:

<%@ Page Title="Home Page" Language="C#" MasterPageFile="~/Site.master" AutoEventWireup="true"
    CodeBehind="Default.aspx.cs" Inherits="WebApplication1._Default" %>

<asp:Content ID="HeaderContent" runat="server" ContentPlaceHolderID="HeadContent">
</asp:Content>
<asp:Content ID="BodyContent" runat="server" ContentPlaceHolderID="MainContent">
    <h2>
        This is the main site!
    </h2>
    <p>
        To learn more about ASP.NET visit <a href="http://www.asp.net" title="ASP.NET Website">www.asp.net</a>.
    </p>
    <p>
        You can also find <a href="http://go.microsoft.com/fwlink/?LinkID=152368&amp;clcid=0x409"
            title="MSDN ASP.NET Docs">documentation on ASP.NET at MSDN</a>.
    </p>
</asp:Content>

Make sure the cloud project is set as the start project, and then run (F5) your solution. Your web browser should open and display the site. While this is still running, open IIS manager and open the list of sites; you should see the site for this solution (the name will be different). The Windows Azure Compute emulator actually uses IIS to run your web role by creating a site on your local machine. Stop your debugging session.

2. Adding the second site

Right-click your solution and add another ASP.NET web project, calling it “TheSecondWebSite”.
Update the default.aspx again to show that this is the second site:

<%@ Page Title="Home Page" Language="C#" MasterPageFile="~/Site.master" AutoEventWireup="true"
    CodeBehind="Default.aspx.cs" Inherits="WebApplication2._Default" %>

<asp:Content ID="HeaderContent" runat="server" ContentPlaceHolderID="HeadContent">
</asp:Content>
<asp:Content ID="BodyContent" runat="server" ContentPlaceHolderID="MainContent">
    <h2>
        The second web site!
    </h2>
    <p>
        To learn more about ASP.NET visit <a href="http://www.asp.net" title="ASP.NET Website">www.asp.net</a>.
    </p>
    <p>
        You can also find <a href="http://go.microsoft.com/fwlink/?LinkID=152368&amp;clcid=0x409"
            title="MSDN ASP.NET Docs">documentation on ASP.NET at MSDN</a>.
    </p>
</asp:Content>

Now open the cloud service definition file (ServiceDefinition.csdef) and look at the <Sites> element and its children. Copy the <Site> element to create another one. Modify the site name and add a physicalDirectory attribute set to the path of the second site. Inside the <Site> element, look for the <Binding> element and add another attribute, hostHeader, set to www.contoso.com. I’m using www.contoso.com for testing purposes; you can use your own site url if you want. We’ll be changing the hosts file on your machine so you can test everything, but this won’t work for other people.
If you really want to make this work you will need to make the necessary DNS entries to redirect your site url to the cloud url…

ServiceDefinition.csdef:

<Sites>
  <!-- First site -->
  <Site name="Check" physicalDirectory="C:\...\FullIIS_MultipleSites\WebApplication1">
    <Bindings>
      <Binding name="Endpoint1" endpointName="Endpoint1" />
    </Bindings>
  </Site>
  <!-- Second site -->
  <Site name="Encore" physicalDirectory="C:\...\FullIIS_MultipleSites\WebApplication2">
    <Bindings>
      <Binding name="Endpoint1" endpointName="Endpoint1" hostHeader="www.contoso.com" />
    </Bindings>
  </Site>
</Sites>

Before you run your solution we need to make sure that the host header www.contoso.com points to the local environment. You can do this by editing the hosts file, which can be found at <Windows>\System32\drivers\etc\hosts (you can edit this file using notepad). Make sure you have the following line in this file:

127.0.0.1 www.contoso.com

Run your solution. Your browser should show the first site. Note the port used by the first site (in the screenshot above this is port 81; the compute emulator takes the first available port above port 80). Now you can open the second site by browsing to www.contoso.com:81 (use your own port here). Before you stop debugging, open the IIS manager again; you should now see two sites (you may need to refresh the sites node).

3. Creating a nested virtual application

Add another web project, calling it TheThirdSite. Change default.aspx again to reflect that this is the third site.
Open the service definition again and add a VirtualApplication element inside the second site:

ServiceDefinition.csdef:

<Sites>
  <!-- First site -->
  <Site name="Check"
        physicalDirectory="C:\...\FullIIS_MultipleSites\WebApplication1">
    <Bindings>
      <Binding name="Endpoint1" endpointName="Endpoint1" />
    </Bindings>
  </Site>
  <!-- Second site -->
  <Site name="Encore"
        physicalDirectory="C:\...\FullIIS_MultipleSites\WebApplication2">
    <!-- Even a nested virtual application -->
    <VirtualApplication name="More"
                        physicalDirectory="C:\...\FullIIS_MultipleSites\MvcApplication1">
    </VirtualApplication>
    <Bindings>
      <Binding name="Endpoint1" endpointName="Endpoint1" hostHeader="www.contoso.com" />
    </Bindings>
  </Site>
</Sites>

Run again and browse to www.contoso.com:81/More. You should see the third site.

4. Deploying and testing in the cloud

Let’s test your project in the cloud. Deploy your solution. Once deployment is ready, click on the link to open the first site. Now open a command prompt and use the ping command to retrieve the site’s IP address. Open the hosts file and modify the www.contoso.com entry to use this IP address. Browse to www.contoso.com (no need to specify a port number now); you should see the second site. If not, you might need to clear your DNS cache using these commands:

ipconfig /flushdns
net stop dnscache
net start dnscache

Browse to the nested virtual application at www.contoso.com/More. That’s it. You’ve just created your own web role in the cloud, hosting three different sites! To complete everything you should now go to your internet provider to make the necessary DNS entries to redirect your site url to the cloud url!
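As a closing note, the dispatch IIS performs for these sites can be sketched in a few lines: it matches the request’s Host header against the bindings, and the binding without a hostHeader acts as the catch-all. A rough, purely illustrative sketch in Python (the site names mirror the service definition used in this post):

```python
# (site name, hostHeader) pairs mirroring the service definition; None
# means the binding has no hostHeader and catches everything else.
bindings = [
    ("Check", None),
    ("Encore", "www.contoso.com"),
]

def route(host: str) -> str:
    for site, header in bindings:
        if header is not None and header == host:
            return site
    # no explicit match: fall back to the binding without a hostHeader
    return next(site for site, header in bindings if header is None)

print(route("www.contoso.com"))  # Encore
print(route("203.0.113.7"))      # Check (any other host hits the default site)
```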

Remote debugging a Windows Azure Worker Role using Azure Connect, Remote desktop and the remote debugger, part 3

This is the third part of “Remote debugging a Windows Azure Worker Role using Azure Connect, Remote desktop and the remote debugger”. In this part I will show you how to attach the remote debugger to your worker role and start debugging…

Step 3: Connect to the remote debugger

In the previous blog post we ended up copying the remote debugger to the worker role instance machine. Open the remote debugger folder (the one you just copied) and start the remote debugger by double-clicking the msvsmon.exe application. The remote debugger should start. Take note of the server name, then minimize the remote desktop session. Open Visual Studio (the one you used to publish the worker role project) and select Debug->Attach to Process… The Attach to Process window should open. Type in the server name you got when you started the remote debugger. Be patient while Visual Studio downloads the list of processes from the remote server… WARNING: if you get authentication errors, it means you are not using the same username and password as on your local machine. After a while you should get a list of all processes. Select the WaWorkerHost.exe process and click the Attach button.

Step 4: Debug!

Now you can add a breakpoint in your code and wait for the worker role to hit it. Again you will have to be patient… Happy debugging!

Remote debugging an Azure Worker role, using Azure Connect, Remote desktop and the remote debugger

So, you have some code running in a Windows Azure Worker role on Azure (not on the Azure Compute Emulator) and you need to debug what is going on. In this blog post I will show you how you can use Azure Connect, remote desktop and the remote debugger to look inside your worker role. WARNING: this process can be a strain on your patience, since the debugger will work very slowly! But it is the only way to look inside a running worker role using the debugger…

Step 1: Install your worker role on Azure, enabling remote desktop and Azure Connect

First we need to install the worker role in an Azure instance, not forgetting to enable remote desktop and Azure Connect. To enable Azure Connect, right-click on the worker role and select Properties. The worker role configuration window will open; select the Virtual Network tab. Check the “Activate Windows Azure Connect” checkbox and copy-paste your activation token into the textbox. To get the activation token, go to the Windows Azure Portal, select the Virtual Network tab and select your subscription (you may need to request access to Windows Azure Connect because it is still in CTP). Click on the Get Activation Token button, copy the activation token and paste it into the project’s properties. If you have never done so before, click on the Install Local Endpoint button to get the local software installed on your computer. To enable Remote Desktop, you first need to install a certificate on your Hosted Service, containing both the private and public keys. This is easy to do using Visual Studio: right-click your Azure project and select Publish… The Deploy Windows Azure project dialog will open. Click on the Configure Remote Desktop connection link, which opens the Remote Desktop Configuration dialog. Select a certificate from the drop-down, or create a new certificate using the <Create…> entry (the last entry in the dropdown combobox).
To export the certificate (so you can upload it through the Azure Portal), click the View… button, go to the Details tab and click the Copy to File… button. The Certificate Export Wizard opens. Click Next, select the “Yes, export the private key” option and press Next. Keep the default .pfx format and press Next again. Enter a password and press Next. Select a file name, press Next, then Finish.

Now we’re ready to import the certificate in the Azure Portal. Go to your hosted services, select the service of your choice and then select its Certificates folder. Click the Add Certificate button, browse to the previously created .pfx file, enter your password and press Create. This adds the certificate to the hosted service.

Go back to Visual Studio (the Remote Desktop Configuration dialog should still be waiting for you). Select the certificate and enter your username and password. WARNING: use your real username and password, so you can authenticate to the remote server from Visual Studio; otherwise you will get authentication errors! Enter an appropriate expiration date and click Ok to start publishing. Wait for Visual Studio to complete… That is step 1. In the next post I will show you how to copy the remote debugger to your worker role instance…
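If you prefer the command line over the export wizard, certutil can produce the same password-protected .pfx file. This is a sketch only: it assumes the certificate lives in the current user’s My store, and MyAzureCert is a hypothetical subject name — substitute your own certificate’s subject or thumbprint and a real password:

```cmd
rem Export the certificate, including its private key, to a .pfx file
certutil -user -p MySecretPassword -exportPFX My MyAzureCert azurecert.pfx
```

The resulting azurecert.pfx can then be uploaded to the hosted service’s Certificates folder exactly as described above.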

Getting Started Developing on Azure: Creating the Hosted Service and Storage Account

This is the next post after Getting Started developing on Azure with Visual Studio 2010 and Installing the Azure Management Certificates. Starting to develop on Azure with Visual Studio can be a lot to take in the first time: getting Visual Studio ready, installing the management certificates and so on is not a simple task (the first time, anyway). That is why I made this little walk-through series on starting to develop on Azure… In the first part you’ll build and run your first project on the Azure Compute Emulator, the local test version of the cloud. In the second part you’ll get Visual Studio ready to publish your solution directly to the cloud by installing the management certificates, and in this part you will create the Hosted Service and Storage Account you need to actually deploy from Visual Studio.

1.3 Creating the Hosted Service

Go back to the Azure Management Portal (windows.azure.com) and click on the Hosted Services tab. To deploy we also need a hosted service, so click the New Hosted Service button to create one. This opens the Create a New Hosted Service dialog. Enter a name for your service and a URL prefix. The prefix must be globally unique, so try adding your company name or something else unique to you; if the chosen name is already taken you will be notified. Next we need to choose in which data center the service should be deployed. We can also do this through an affinity group. An affinity group is an easy way to keep everything together: services can communicate with one another more efficiently (and at less cost) if they run in the same data center. So choose affinity group and select Create a new affinity group… from the dropdown list. Enter a name and pick a data center, ideally one near your expected customers. North Europe is the data center in Amsterdam, which is closest to where I live, so I took that one; feel free to take another… Click OK.
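Under the hood the portal talks to the Service Management REST API, so the same hosted service can also be created programmatically with a single POST to https://management.core.windows.net/<subscription-id>/services/hostedservices (authenticated with your management certificate). A sketch of the request body — service name, label and affinity group are placeholders, and note that the Label element must be base64-encoded:

```xml
<?xml version="1.0" encoding="utf-8"?>
<CreateHostedService xmlns="http://schemas.microsoft.com/windowsazure">
  <ServiceName>myuniqueprefix</ServiceName>
  <!-- base64 of "MyService" -->
  <Label>TXlTZXJ2aWNl</Label>
  <AffinityGroup>MyAffinityGroup</AffinityGroup>
</CreateHostedService>
```

Either a Location or an AffinityGroup element is required, never both; using the affinity group keeps the service in the same data center as the storage account we create next.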
We’ll deploy using Visual Studio, so before you click OK, select Do not deploy. Now click OK.

1.1.5 Creating the Storage Account

Finally, before we can deploy we also need to create a Storage Account: Visual Studio will upload the package to storage and then instruct Azure to deploy it from there. Go back to the Management Portal (windows.azure.com) and go to the Storage Accounts tab. Click on the New Storage Account button. This opens the Create a New Storage Account dialog. Enter a unique (lowercase only) URL and use the same affinity group you created in the previous step. Hit OK.

1.1.6 Ready to deploy the web site

Now we are ready to deploy! First we need to remove the local development trace listener from web.config, because it is not available in the cloud, but leave the diagnostic monitor listener:

<listeners>
  <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
       name="AzureDiagnostics">
    <filter type="" />
  </add>
</listeners>

Go back to Visual Studio, right-click your Azure project and select Publish… Now select your hosted service and storage account, and hit OK (you might need to cancel and re-open this window to refresh it). Deployment should start and after a little while you should see the Windows Azure Activity Log. Wait until it completes; this may take several minutes or longer depending on bandwidth… Click on the Website URL and the site should open.

1.1.7 Using Server Explorer

Open Server Explorer. With the Azure SDK tools installed, you can look at Azure Compute and Azure Storage from here. Click on the Windows Azure Compute tree item, then select Add Deployment Environment… Select your management certificate to list all your hosted services. Open the tree item, select Staging or Production and click OK.
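The diagnostic monitor listener we kept in web.config writes its traces to Windows Azure storage, and which storage account it uses comes from the Diagnostics connection string in ServiceConfiguration.cscfg. A sketch of that setting — the account name and key are placeholders for the storage account you just created:

```xml
<ConfigurationSettings>
  <!-- Point the diagnostics plug-in at your own storage account -->
  <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
           value="DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=..." />
</ConfigurationSettings>
```

With that in place, the deployed instances can write their diagnostic data to the storage account instead of the emulator.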
From now on you can look at this environment from the Visual Studio Server Explorer. You can try the same for storage. Later we will look at how we can use this to debug applications with IntelliTrace. The end. My next post (next Wednesday) will be on remote debugging worker roles running in the cloud…