Storing messages in table storage

In my previous post I looked at getting started with table storage; in this one we will create a table for our entities and store them. As you'll see, it is quite easy.

To store an entity in table storage you start by creating a class derived from TableServiceEntity (a recap from the previous post):

public class MessageEntity : TableServiceEntity
{
    public MessageEntity() { }

    public MessageEntity(string partitionKey, string rowKey, string message)
        : base(partitionKey, rowKey)
    {
        Message = message;
    }

    public string Message { get; set; }
}

You also need a class for the table itself, this time deriving from TableServiceContext:

public class MessageContext : TableServiceContext
{
    public const string MessageTable = "Messages";

    public MessageContext(string baseAddress, StorageCredentials credentials)
        : base(baseAddress, credentials)
    {
    }
}

TableServiceContext requires a base address and credentials, and since our class derives from it we need a constructor taking the same arguments. I also added a const string for the table name.

If the table doesn't exist yet you should create it. This is easy: just add the code that creates the table to MessageContext's static constructor:

static MessageContext()
{
    var tableClient = MyStorageAccount.Instance.CreateCloudTableClient();
    tableClient.CreateTableIfNotExist(MessageTable);
}

A static constructor is called automatically the first time the type is used. Note that I use the MyStorageAccount class, which uses the same static constructor trick to initialize the storage account:

public static class MyStorageAccount
{
    public static string DataConnection = "DataConnection";

    public static CloudStorageAccount Instance
    {
        get { return CloudStorageAccount.FromConfigurationSetting(DataConnection); }
    }

    static MyStorageAccount()
    {
        CloudStorageAccount.SetConfigurationSettingPublisher(
            (config, setter) =>
            {
                setter(RoleEnvironment.IsAvailable
                    ? RoleEnvironment.GetConfigurationSettingValue(config)
                    : ConfigurationManager.AppSettings[config]);

                RoleEnvironment.Changing += (_, changes) =>
                {
                    if (changes.Changes
                               .OfType<RoleEnvironmentConfigurationSettingChange>()
                               .Any(change => change.ConfigurationSettingName == config))
                    {
                        if (!setter(RoleEnvironment.GetConfigurationSettingValue(config)))
                        {
                            RoleEnvironment.RequestRecycle();
                        }
                    }
                };
            });
    }
}

Now we are ready to add the code that creates a message and adds it to our table. Add the following members to MessageContext:

public static MessageEntity CreateMessage(string message)
{
    return new MessageEntity(MessageTable, Guid.NewGuid().ToString(), message);
}

public void AddMessage(MessageEntity msg)
{
    this.AddObject(MessageTable, msg);
    this.SaveChanges();
}

The CreateMessage method creates a new MessageEntity instance with the same partition key for every message (I don't expect to store a lot of messages), a unique Guid as the row key, and of course the message itself. The AddMessage method adds this entity to the table and then calls SaveChanges to send the new row to table storage. This mechanism uses the same concepts as WCF Data Services.

In the previous post we created a web site with a textbox and a button. Implement the button's click event as follows (context is a field on the page holding a MessageContext instance; a sketch of it follows below):

protected void postButton_Click(object sender, EventArgs e)
{
    string message = messageText.Text;
    var msg = MessageContext.CreateMessage(message);
    context.AddMessage(msg);
}

This will allow you to add messages to storage.
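The post does not show how the page creates that context field. A minimal sketch, assuming the page simply builds a MessageContext from the shared MyStorageAccount (the field name context matches the handler above):

// Sketch (not shown in the original post): the page-level field used by
// postButton_Click and LoadMessages, built from the shared storage account.
private readonly MessageContext context = new MessageContext(
    MyStorageAccount.Instance.TableEndpoint.AbsoluteUri,
    MyStorageAccount.Instance.Credentials);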
Before you can run this sample, you also need to set up the connection. Double-click the CloudMessages project beneath the Roles folder; this opens the project's configuration window. Select the Settings tab and add a "DataConnection" setting, choose "Connection String" as the type, and then select your preferred storage account. In the beginning it is best to use development storage, and that is what I did here.

After running the web site you are of course wondering whether your messages were actually added. So let's add some code and UI to display the messages in the table. Start by adding the following property to MessageContext:

public IQueryable<MessageEntity> Messages
{
    get { return CreateQuery<MessageEntity>(MessageTable); }
}

This property returns an IQueryable<MessageEntity>, which LINQ then uses for writing queries. The actual query is performed in our web page class. But first we need to add some UI to display the messages. Add a Repeater control beneath the TextBox and Button:

<asp:Content ID="BodyContent" runat="server" ContentPlaceHolderID="MainContent">
  <p>
    <asp:TextBox ID="messageText" runat="server" Width="396px"></asp:TextBox>
    <asp:Button ID="postButton" runat="server" OnClick="postButton_Click" Text="Post message" />
  </p>
  <p>
    <asp:Repeater ID="messageList" runat="server">
      <ItemTemplate>
        <p>
          <%# ((MessagesLib.MessageEntity) Container.DataItem).Message %>
        </p>
      </ItemTemplate>
    </asp:Repeater>
  </p>
</asp:Content>

Now that we can display the messages, let's add a LoadMessages method below the click event handler of the page:

private void LoadMessages()
{
    var query = from msg in context.Messages
                select msg;

    messageList.DataSource = query.ToList()
                                  .OrderBy(m => m.Timestamp)
                                  .Take(10);
    messageList.DataBind();
}

Call this method in the Load event of the page:

protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        LoadMessages();
    }
}

And again in the button's click event:

protected void postButton_Click(object sender, EventArgs e)
{
    string message = messageText.Text;
    var msg = MessageContext.CreateMessage(message);
    context.AddMessage(msg);
    LoadMessages();
}

Run the application, add some messages, and see them listed. Only the first 10 messages are displayed; change the query as you like, for example by filtering on the partition key as sketched below.
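As one example of such a tweak, you could restrict the query to the single partition that CreateMessage writes to, which table storage can answer efficiently. A sketch of that variant of LoadMessages (same names as above; the where clause is the only addition):

private void LoadMessages()
{
    // Only this partition is ever written by MessageContext.CreateMessage.
    var query = from msg in context.Messages
                where msg.PartitionKey == MessageContext.MessageTable
                select msg;

    messageList.DataSource = query.ToList()
                                  .OrderBy(m => m.Timestamp)
                                  .Take(10);
    messageList.DataBind();
}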

WCF and large messages

This week I've been training a couple of people on how to use .NET, WCF 4, Entity Framework 4 and other technologies to build an enterprise application. One of the things we did was return all rows from a table, and this table contains about 2500 rows. We're using the Entity Framework 4 self-tracking entities, which also serialize all changes made to the objects. And I kept getting this error:

The underlying connection was closed: The connection was closed unexpectedly.

Server stack trace:
   at System.ServiceModel.Channels.HttpChannelUtilities.ProcessGetResponseWebException(WebException webException, HttpWebRequest request, HttpAbortReason abortReason)
   at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout)
   at System.ServiceModel.Channels.RequestChannel.Request(Message message, TimeSpan timeout)
   at System.ServiceModel.Dispatcher.RequestChannelBinder.Request(Message message, TimeSpan timeout)

At first I thought it had something to do with the maximum message size, another kind of error you get when sending large messages:

The maximum message size quota for incoming messages (65536) has been exceeded. To increase the quota, use the MaxReceivedMessageSize property on the appropriate binding element.

Server stack trace:
   at System.ServiceModel.Channels.HttpInput.ThrowMaxReceivedMessageSizeExceeded()
   at System.ServiceModel.Channels.HttpInput.GetMessageBuffer()
   at System.ServiceModel.Channels.HttpInput.ReadBufferedMessage(Stream inputStream)

That one is easy to fix, although a little confusing because you have to configure the binding on the receiving side of the message, which most of the time is the client (a configuration sketch follows at the end of this post). But doing this didn't help. So what was it? I knew the size of the message couldn't be the problem, because I'd sent far bigger messages before. Maybe there was something in the contents that made the DataContractSerializer crash? Checking this is easy: I wrote a little code that makes the serializer write everything to a stream and watched what happened. It worked fine. Hmmm. What could it be?

So I went over the list of properties of the DataContractSerializer. It has a MaxItemsInObjectGraph property. Maybe that was it, but how do I change this number? Looking at the behaviors, I found it. When you send a large number of objects you have to increase this number, which is easy. At the server side you use the dataContractSerializer service behavior and set its value to a large enough number; at the client side you use the dataContractSerializer endpoint behavior (both are sketched below). That fixed my problem.
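For reference, here is roughly what the message-size fix looks like on the receiving (client) side. The post's own configuration is not reproduced here, so the binding name, the 10 MB value, the address and the contract below are assumptions:

<!-- Receiving side (usually the client): raise the incoming message quota on the binding. -->
<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <!-- 10485760 bytes = 10 MB; choose a limit that fits your largest response. -->
      <binding name="LargeMessages" maxReceivedMessageSize="10485760" />
    </basicHttpBinding>
  </bindings>
  <client>
    <!-- The endpoint must reference the binding configuration by name. -->
    <endpoint address="http://localhost/ProductService.svc"
              binding="basicHttpBinding"
              bindingConfiguration="LargeMessages"
              contract="IProductService" />
  </client>
</system.serviceModel>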
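And the fix that actually solved the problem, raising MaxItemsInObjectGraph, goes through the standard dataContractSerializer behavior element. Again a sketch rather than the post's exact configuration; the behavior names are assumptions, and the service and the client endpoint must reference them via behaviorConfiguration:

<system.serviceModel>
  <behaviors>
    <!-- Server side: raise the object-graph quota through a service behavior. -->
    <serviceBehaviors>
      <behavior name="LargeGraphService">
        <dataContractSerializer maxItemsInObjectGraph="2147483647" />
      </behavior>
    </serviceBehaviors>
    <!-- Client side: the same setting as an endpoint behavior. -->
    <endpointBehaviors>
      <behavior name="LargeGraphEndpoint">
        <dataContractSerializer maxItemsInObjectGraph="2147483647" />
      </behavior>
    </endpointBehaviors>
  </behaviors>
</system.serviceModel>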

Building an Enterprise Application with Entity Framework 4

Entity Framework 3.5 was a bit of a disappointment when it came to supporting enterprise applications. For me the major reason was that entities used by EF had to derive from a class that is part of EF, which bakes an EF dependency into your business logic layer (BLL) and presentation layer. EF 4 is still under development, but it already compensates for a lot of this with its support for POCOs (Plain Old CLR Objects) and self-tracking objects.

POCO vs. self-tracking

A self-tracking object is an object that carries state telling you what has happened to it (unchanged, new, modified, deleted). The advantage of this kind of object is simplicity for the user, because the object does all the tracking itself. Of course this means more work building the object, but that can be solved using T4 templates. A POCO is really nothing more than a data carrier object, without any tracking support. That simplicity means maximum portability, especially if you use DataContracts. For the rest of this post I'll be using self-tracking objects, generated through EF 4. I'll also be using the Entity Framework Feature CTP 2.

Using EF 4 to generate the self-tracking objects

Start by creating a new WinForms project (or WPF, Silverlight, whatever). Add a library project for your data access layer (DAL) and another one for your entities. Normally I would also add a business logic layer (BLL), but for simplicity I'll leave it out for now.

Now add a new Entity Data Model to your DAL project. Select the Northwind database, then select the Categories and Products tables. This gives you the model. Please note that my tables/entities each have an extra column, the Version column. This is a timestamp column used to detect concurrent updates. To tell EF to use this column for concurrency, set its Concurrency Mode property to Fixed. This is typically the best way to handle concurrent updates.

Right-click your entity model's background, then select the Add Code Generation Item… menu choice. This alternative code generation adds two T4 templates (with the .tt extension) to the DAL project. ProductsModel.Context.tt is an EF-dependent template, so leave it in the DAL project, but ProductsModel.Types.tt contains the EF-independent types, which are the actual self-tracking entities. Move this template to the Entities project. Watch out: your solution will not build until you set the following references (diagram made with Visual Studio's new UML component diagram). This diagram also includes the BLL layer; our solution doesn't, but if you want, feel free! If you're using VB.NET, you should also add the Products.Entities namespace to the list of imports of the DAL project.

Now you're ready to implement the DAL layer, so add a ProductsDAL class as follows:

Public Class ProductsDAL

    Public Function GetListOfCategories() As List(Of Category)
        Using db As New NorthwindEntities
            Return db.Categories.ToList()
        End Using
    End Function

    Public Sub UpdateCategory(ByVal cat As Category)
        Using db As New NorthwindEntities
            db.Categories.ApplyChanges(cat)
            db.SaveChanges()
        End Using
    End Sub

End Class
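To make the self-tracking part concrete, here is a small usage sketch of ProductsDAL from the client side. It is only a sketch under the assumption that the entities were generated with the self-tracking template above, which exposes a ChangeTracker on each entity (the same property the "Change Tracking Enabled" checkbox binds to later in this post):

' Sketch: a round trip through ProductsDAL with a self-tracking entity.
Dim dal As New ProductsDAL()
Dim categories As List(Of Category) = dal.GetListOfCategories()

Dim cat As Category = categories(0)
cat.ChangeTracker.ChangeTrackingEnabled = True   ' start recording changes on this entity
cat.CategoryName = "Beverages and more"          ' recorded as a modification
dal.UpdateCategory(cat)                          ' ApplyChanges replays the tracked state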
Now let's add some controls and data-bind them in Windows Forms. For this I use the Data Sources window: open it and add a new data source. Select an object data source, then select your Category or Product entity, which you should find in the Products.Entities project. Your Data Sources window should now display your entities.

Right-click Category and select Customize… from the drop-down list, then pick ListBox as the control to use. Drag the Category entity to your form to create a ListBox and a BindingSource. Add two buttons, a Load and a Save button. Implement the Load button to retrieve the list of categories from the DAL:

Dim dal As New ProductsDAL
CategoryBindingSource.DataSource = dal.GetListOfCategories()

And implement the Save button as follows:

Dim dal As New ProductsDAL
Dim cat As Category = TryCast(CategoryBindingSource.Current, Category)
dal.UpdateCategory(cat)

Run the solution and click Load. The listbox should fill with categories (you might first need to copy the connection string from the DAL project's config file to the form project's config file). Note the "Change Tracking Enabled" checkbox: check it if you want to update an object, because this enables the self-tracking. Make a change, then click Save. This should update the database.

Now open two instances of the application. Click Load in both, then change the same record in both (with tracking enabled) and save both. The second Save should fail because of the concurrency check; a sketch of catching that failure follows below. Done!
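Because the Version column's Concurrency Mode is Fixed, that second Save surfaces as an OptimisticConcurrencyException thrown by SaveChanges. Here is a sketch of how UpdateCategory could report it; this is an assumption on my part, not part of the original walkthrough, and it needs Imports System.Data for the exception type:

' Sketch: surface concurrent-update conflicts to the caller.
Public Sub UpdateCategory(ByVal cat As Category)
    Using db As New NorthwindEntities
        Try
            db.Categories.ApplyChanges(cat)
            db.SaveChanges()
        Catch ex As OptimisticConcurrencyException
            ' Another user changed this row since it was loaded; the timestamp no longer matches.
            Throw New InvalidOperationException(
                "The category was modified by another user. Reload and try again.", ex)
        End Try
    End Using
End Sub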