U2U Blog Center

For developers and other creative minds

Welcome to the U2U Blog Center

Over the last 18 years, U2U has built up a huge amount of expertise in the Microsoft .NET Framework: expertise they have always shared with the community. Below you can find a list of the blogs maintained by U2U trainers.

Using the Prism 4.0 Event Aggregator

This article illustrates the problems that occur when you use regular .NET events for intercomponent communication in a composite application, and shows how to solve them by implementing the Mediator design pattern, using the Event Aggregator from the Prism 4.0 framework as an example.

Composite applications

A composite application draws its functionality from a set of as-loosely-coupled-as-possible functional modules. At runtime, these modules are loaded into an application shell that provides common non-functional services, such as security, screen layout, and localization. The Managed Extensibility Framework (MEF) simplifies the design and development of this type of application by providing module discoverability and composition capabilities.

Discoverability

Types that want to be discovered by MEF indicate this with attributes. The following snippets from the attached Visual Studio project show how a publisher and a subscriber make themselves available to the outside world (i.e. to a viewmodel that will hook them together):

[Export]
public class Subscriber : INotifyPropertyChanged
{
    // ...
}

[Export(typeof(IPublisher))]
public class Publisher : IPublisher
{
    // ...
}

Composition

At runtime, the application's object graph is constructed through a process called composition. All the application needs to do is decorate the required parts with attributes.
This is how the viewmodels in the sample project request a publisher and a subscriber:

[Import]
public IPublisher Publisher { get; set; }

[Import]
public Subscriber Subscriber { get; set; }

The exported parts are looked up by MEF in catalogs, placed in a container, and then bound to the corresponding imports:

AssemblyCatalog catalog = new AssemblyCatalog(Assembly.GetExecutingAssembly());
CompositionContainer container = new CompositionContainer(catalog);
container.ComposeParts(this);

MVVM

Despite its strict modularity, a composite application is not really different from any other enterprise application you would build in WPF, Silverlight, or WP7. For maintainability and testability purposes it definitely makes sense to develop the modules as well as the shell according to the MVVM pattern. You'll end up with an application architecture that looks like a grid: functionality is layered vertically into modules, and technicality is layered horizontally into views, viewmodels, and models. Both axes should maintain a level of independence between their constituents: individual modules may make no assumptions about other modules, and within a module the models shouldn't know their viewmodels, viewmodels shouldn't know their views, and views should have no code-behind. The attached sample project has a simplified version of this architecture: it does everything in one project. It is representative nevertheless: all objects are injected through MEF, and the application is strictly MVVM. Here's a screenshot:

Communication

When the composite application is running, components living in different architectural cells will want to talk to each other; e.g. viewmodels from different modules want to play their role in an application-wide publisher-subscriber scenario. An implementation based on .NET events seems the obvious choice here.
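That obvious approach could be wired up like this (a minimal sketch with hypothetical class names, not taken from the sample project):

```csharp
using System;

public class Publisher
{
    public event EventHandler Published;

    public void DoPublish()
    {
        // The delegate behind 'Published' holds a reference to every
        // subscriber as the 'target' of its handler method.
        EventHandler handler = this.Published;
        if (handler != null)
        {
            handler(this, EventArgs.Empty);
        }
    }
}

public class Subscriber
{
    public Subscriber(Publisher publisher)
    {
        // Registration requires a reference from the subscriber to the publisher,
        // and from here on the publisher's delegate keeps this subscriber alive.
        publisher.Published += this.Publisher_Published;
    }

    public int ReceivedCount { get; private set; }

    private void Publisher_Published(object sender, EventArgs e)
    {
        this.ReceivedCount++;
    }
}
```

It works, but as the next section shows, this wiring has consequences for object lifetimes.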
Regular Events

Unfortunately, regular events require a reference from the subscriber to the publisher, at least while registering the event handler. After that, the subscriber doesn't need the reference to the publisher anymore. But even if you throw it away, the objects stay tightly connected: the publisher holds a reference to the subscriber in its delegate, as the 'target' of the handler method. This prevents the subscriber from being garbage collected, even if the rest of the application isn't referencing it anymore. You can verify this easily by running the sample project: after you 'kill' the regular event subscriber, it remains in memory. The only way to clean up the memory is to unregister the event handler, e.g. in a Dispose method. Unfortunately this implies that the subscriber should hold a permanent reference to the publisher, now preventing the publisher from being cleaned up. In this scenario you should also know exactly when to dispose the subscriber. In a lot of cases, especially in composite applications, this is simply not possible.

EventAggregator

What you need in a composite application is a kind of central chat room, where all components anonymously appear, communicate, and disappear. This is exactly what the classic Gang-of-Four Mediator design pattern does: it describes how to moderate communication between components that aren't aware of each other's identity. The Prism 4.0 Event Aggregator is a robust implementation of this design pattern. Prism is a framework and a set of guidelines for building composite applications in WPF, Silverlight, and WP7; you'll find more info here. In our sample application, the publisher and subscriber need a reference to a central event aggregator. We'll use MEF to do this.
First the publisher and subscriber add the event aggregator to their list of required parts:

[Import]
public EventAggregator EventAggregator
{
    get { return this.eventAggregator; }
    set { this.eventAggregator = value; }
}

Then you have to make sure that the event aggregator becomes available, e.g. by explicitly adding it to the MEF container (Prism has other ways to do this):

AssemblyCatalog catalog = new AssemblyCatalog(Assembly.GetExecutingAssembly());
CompositionContainer container = new CompositionContainer(catalog);
var batch = new CompositionBatch();
batch.AddExportedValue<EventAggregator>(new EventAggregator());
container.Compose(batch);
container.ComposeParts(this);

Now it's time to define the event's payload type - a Prism requirement:

using Microsoft.Practices.Prism.Events;

public class PublishedEvent : CompositePresentationEvent<object> { }

The publisher publishes an event with the following two lines:

PublishedEvent pe = this.eventAggregator.GetEvent<PublishedEvent>();
pe.Publish(null);

In the subscriber, registering for an event is also a two-liner:

PublishedEvent pe = this.eventAggregator.GetEvent<PublishedEvent>();
pe.Subscribe(o => this.Publisher_Published(o));

The event aggregator holds publishers and subscribers together via weak references, allowing participants to disappear without causing memory leaks. 'Killing' the subscriber in the sample application will release its memory.

Extras

I hope you're convinced of the need for a communication Mediator in some of your applications. If you go for Prism's event aggregator, you get a couple of extras. In some cases it's necessary to 'upgrade' the default weak reference to a strong one. You can do this by setting the second parameter of the registration method to true:

pe.Subscribe(
    o => this.Publisher_Published(o),
    true);

A subscriber can also specify on which thread it wants to be notified. This is handy when the event handler needs the UI thread.
In a viewmodel of an MVVM application this should never be necessary: the bindings take care of it. Lastly, you can apply a filter on the subscription, so that some events will be ignored. The following (admittedly silly) example ignores events raised during 'odd' minutes:

pe.Subscribe(
    o => this.Publisher_Published(o),
    ThreadOption.PublisherThread,
    false,
    o => DateTime.Now.Minute % 2 == 0);

Source code

Here's the source, the whole source, and nothing but the source: U2UConsult.Composite.Communication.Sample.zip (619,46 kb)

Enjoy!

Optimistic concurrency using a SQL DateTime in Entity Framework 4.0

This article explains how to implement optimistic concurrency checking using a SQL Server DateTime or DateTime2 column. It's a follow-up to my previous article on using a TimeStamp column for the same purpose. In most, if not all, concurrency checking cases it actually makes more sense to use a DateTime column instead of a TimeStamp. The DateTime data types occupy the same storage (8 bytes) as a TimeStamp, or even less: DateTime2 with 'low' precision takes only 6 bytes. On top of that, their content makes sense to the end user. Unfortunately the DateTime data types are 'a little bit' less obvious to use for concurrency checking: you need to declare a trigger (or a stored procedure) on the table, and you need to hack the entity model.

A sample table

Sample time! Let's start with creating a table to hold some data.

Table definition

Here's what the table looks like (the solution at the end of the article contains a full T-SQL script). The LastModified column will be used for optimistic concurrency checking:

CREATE TABLE [dbo].[Hero](
    [Id] [int] IDENTITY(1,1) NOT NULL,
    [Name] [nvarchar](50) NOT NULL,
    [Brand] [nvarchar](50) NULL,
    [LastModified] [datetime] NULL,
 CONSTRAINT [PK_Hero] PRIMARY KEY CLUSTERED
(
    [Id] ASC
))

Trigger definition

Unlike an Identity or a TimeStamp value, a DateTime value is not automatically generated and/or updated. So we have to give the database a little help, e.g. by creating a trigger for insert and update on the table:

CREATE TRIGGER [dbo].[trg_iu_Hero]
ON [dbo].[Hero]
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    UPDATE [dbo].[Hero]
       SET LastModified = GETDATE()
     WHERE Id IN (SELECT Id FROM inserted)
END

Alternatively, you could insert through a stored procedure.

A sample application

I already prepared a small sample application for you. Here's what the main window looks like:

Cute, isn't it?
The problem

In the entity model, you have to make sure that the LastModified column has the correct settings (Concurrency Mode: Fixed, and StoreGeneratedPattern: Computed). Run the application with just the generated code, and you will observe that when you update a record, the entity's LastModified property is NOT updated. SQL Server Profiler reveals that only an update statement is issued: the new value of LastModified is assigned by the trigger but NOT fetched.

The solution

In order to let the Entity Framework fetch the new value of the DateTime column (or whatever column is modified by a trigger), you need to hack the model's XML and manually add the StoreGeneratedPattern="Computed" attribute to the column in the SSDL; the designer only records this setting in the conceptual model. Somewhere in Redmond there is certainly an architect who can provide an excuse for this behavior; to us developers, it sure smells like a bug. Anyway, if you re-run the application with the modified SSDL, the new DateTime value appears after each insert or update. SQL Server Profiler reveals the extra select statement.

Source Code

Here's the source code, the whole source code, and nothing but the source code: U2UConsult.EF40.DateTimeConcurrency.Sample.zip (616,27 kb)

Enjoy!

Thank you

This article is dedicated to my 3-year-old daughter Merel. Last week she briefly turned into a real angel, but then decided to come back. I want to thank from the bottom of my heart everybody who helped save her life: her mama, her mammie, the MUG, and the emergency, reanimation, intensive care, and pediatric departments of the UZA hospital.

Self-Tracking Entities with Validation and Tracking State Change Notification

This article explains how to extend Self-Tracking Entities (STE) from Entity Framework (EF) 4.0 with validation logic and (tracking) state change notification, with only minimal impact on the T4 files. We'll build a two-tier application that submits local changes from a WPF application via a WCF service to a database table. The STE are extended with validation logic that is reusable on client and server, and the client is notified when the change tracker of an entity changes its state. The tracking state is displayed to the end user as an icon. Here's the client application in action:

For more details on the foundations of building N-Tier apps with EF 4.0, please read Peter Himschoot's article.

Source Code

For the fans of the source-code-first approach, here it is: U2UConsult.SelfTrackingEntities.Sample.zip (622,23 kb)

The structure of the solution is as follows:

Preparation

Database table

First you need a SQL Server table. The provided source code contains a script to generate a working copy of the SalesReason table in the AdventureWorks2008 sample database. This is its initial content:

Data Access Layer

When you have the table, it's time to fire up Visual Studio. Create a WCF web service project with an ADO.NET Entity Model, and add the SalesReason2 table to the model (I renamed the entity and entity set to SalesReason and SalesReasons respectively). While you're in the designer, generate the code for the ObjectContext and the Self-Tracking Entities (right-click in the designer, select "Add Code Generation Item", then select "ADO.NET Self-Tracking Entity Generator"). Finally, add the canonical service methods to fetch the full list of SalesReasons, and to add, delete, and update an individual SalesReason.
Here's an example (I personally like to combine Add and Update operations in a Save method):

public List<SalesReason> GetSalesReasons()
{
    using (AdventureWorks2008Entities model = new AdventureWorks2008Entities())
    {
        List<SalesReason> result = new List<SalesReason>();
        result.AddRange(model.SalesReasons);
        return result;
    }
}

public void DeleteSalesReason(SalesReason reason)
{
    using (AdventureWorks2008Entities model = new AdventureWorks2008Entities())
    {
        model.SalesReasons.Attach(reason);
        model.SalesReasons.DeleteObject(reason);
        model.SaveChanges();
    }
}

public SalesReason SaveSalesReason(SalesReason reason)
{
    using (AdventureWorks2008Entities model = new AdventureWorks2008Entities())
    {
        reason.ModifiedDate = DateTime.Now;
        if (reason.ChangeTracker.State == ObjectState.Added)
        {
            model.SalesReasons.AddObject(reason);
            model.SaveChanges();
            reason.AcceptChanges();
            return reason;
        }
        else if (reason.ChangeTracker.State == ObjectState.Modified)
        {
            model.SalesReasons.ApplyChanges(reason);
            model.SaveChanges();
            return reason;
        }
        else
        {
            return null; // or throw an exception
        }
    }
}

Self Tracking Entities

Add a new class library project to the solution and call it STE. Drag the Model.tt T4 template from the DAL to the STE project. Add a reference to System.Runtime.Serialization in the STE project, and a reference to the STE project in the DAL project. Everything should compile again now.

WPF Client

Add a WPF application to the solution. In this client project, add a reference to the STE, and a service reference to the DAL.
Add a ListBox and some buttons, with straightforward code-behind:

private void RefreshSalesReasons()
{
    this.salesReasons = this.GetSalesReasons();

    this.SalesReasonsListBox.ItemsSource = this.salesReasons;
}

private ObservableCollection<SalesReason> GetSalesReasons()
{
    using (DAL.SalesReasonServiceClient client = new DAL.SalesReasonServiceClient())
    {
        ObservableCollection<SalesReason> result = new ObservableCollection<SalesReason>();
        foreach (var item in client.GetSalesReasons())
        {
            result.Add(item);
        }

        return result;
    }
}

private void Update_Click(object sender, RoutedEventArgs e)
{
    SalesReason reason = this.SalesReasonsListBox.SelectedItem as SalesReason;
    if (reason != null)
    {
        reason.Name += " (updated)";
    }
}

private void Insert_Click(object sender, RoutedEventArgs e)
{
    SalesReason reason = new SalesReason()
    {
        Name = "Inserted Reason",
        ReasonType = "Promotion"
    };
    reason.MarkAsAdded();

    this.salesReasons.Add(reason);

    this.SalesReasonsListBox.ScrollIntoView(reason);
}

private void Delete_Click(object sender, RoutedEventArgs e)
{
    SalesReason reason = this.SalesReasonsListBox.SelectedItem as SalesReason;
    if (reason != null)
    {
        reason.MarkAsDeleted();
    }
}

private void Commit_Click(object sender, RoutedEventArgs e)
{
    using (DAL.SalesReasonServiceClient client = new DAL.SalesReasonServiceClient())
    {
        foreach (var item in this.salesReasons)
        {
            switch (item.ChangeTracker.State)
            {
                case ObjectState.Unchanged:
                    break;
                case ObjectState.Added:
                case ObjectState.Modified:
                    client.SaveSalesReason(item);
                    break;
                case ObjectState.Deleted:
                    client.DeleteSalesReason(item);
                    break;
                default:
                    break;
            }
        }

        this.RefreshSalesReasons();
    }
}

Now you're ready to extend the STE with some extra functionality.

Validation

It's nice to have business rules that can be checked on the client (to provide immediate feedback to the user) as well as on the server (to prevent corrupt data in the database). This can be accomplished by letting the self-tracking entities implement the IDataErrorInfo interface. This interface contains just an indexer (this[]) to validate an individual property, and an Error property that returns the validation state of the whole instance. Letting the STE implement this interface is easily done by adding a partial class file. The following example lets the entity complain if its name gets shorter than 5 characters:

public partial class SalesReason : IDataErrorInfo
{
    public string Error
    {
        get
        {
            return this["Name"];
        }
    }

    public string this[string columnName]
    {
        get
        {
            if (columnName == "Name")
            {
                if (string.IsNullOrWhiteSpace(this.Name) || this.Name.Length < 5)
                {
                    return "Name should have at least 5 characters.";
                }
            }

            return string.Empty;
        }
    }
}

If you add a data template to the XAML with ValidatesOnDataErrors=True in the binding, the GUI will respond immediately when a business rule is broken.
XAML:

<TextBox
   Width="180"
   Margin="0 0 10 0">
    <Binding
       Path="Name"
       Mode="TwoWay"
       UpdateSourceTrigger="PropertyChanged"
       NotifyOnSourceUpdated="True"
       NotifyOnTargetUpdated="True"
       ValidatesOnDataErrors="True"
       ValidatesOnExceptions="True"/>
</TextBox>

Result:

The same rule can also be checked on the server side, to prevent persisting invalid data in the underlying table:

public SalesReason SaveSalesReason(SalesReason reason)
{
    if (!string.IsNullOrEmpty(reason.Error))
    {
        return null; // or throw an exception
    }

    ...

Notification of Tracking State Change

By default, an STE's tracking state can be fetched through instance.ChangeTracker.State. This is NOT a dependency property, and its setter doesn't raise PropertyChanged. Clients can hook an event handler to the ObjectStateChanging event that is raised just before the state changes (there is no ObjectStateChanged event out of the box). You're free to register event handlers in your client, but then you need to continuously keep track of which change tracker belongs to which entity: assignment, lazy loading, and (de)serialization make this a cumbersome and error-prone endeavour. To me, it seems more logical for an entity to expose its state as a direct property, with change notification through INotifyPropertyChanged.
This can be achieved, again, by adding a partial class file:

public partial class SalesReason
{
    private string trackingState;

    [DataMember]
    public string TrackingState
    {
        get
        {
            return this.trackingState;
        }

        set
        {
            if (this.trackingState != value)
            {
                this.trackingState = value;
                this.OnTrackingStateChanged();
            }
        }
    }

    partial void SetTrackingState(string newTrackingState)
    {
        this.TrackingState = newTrackingState;
    }

    protected virtual void OnTrackingStateChanged()
    {
        if (_propertyChanged != null)
        {
            _propertyChanged(this, new PropertyChangedEventArgs("TrackingState"));
        }
    }
}

The only thing you need to do now is make sure that the SetTrackingState method is called at the right moment. The end of HandleObjectStateChanging looks like a nice candidate. Unfortunately this requires a modification of the code that was generated by the T4 template. For performance reasons I used a partial method for this. This is the extract from the SalesReason.cs file:

// This is a new definition
partial void SetTrackingState(string trackingState);

private void HandleObjectStateChanging(object sender, ObjectStateChangingEventArgs e)
{
    this.SetTrackingState(e.NewState.ToString()); // This is a new line

    if (e.NewState == ObjectState.Deleted)
    {
        ClearNavigationProperties();
    }
}

Just modifying the generated code is probably not good enough: if you later need to regenerate your STE (e.g. after adding a column to the underlying table), your modifications will get overridden. So you might want to modify the source code of the T4 template itself (search for the HandleObjectStateChanging method and adapt the generated code). Fortunately this is a no-brainer: most of the T4 template is just like a C# source file, but without IntelliSense.
The rest of the file looks more like classic ASP - ugh. Anyway, you end up with a TrackingState property to which you can bind user interface elements, with or without a converter in between. In the sample application I bound an image to the tracking state:

<Image
   Source="{Binding
       Path=TrackingState,
       Converter={StaticResource ImageConverter}}"
   Width="24" Height="24" />

Here's how it looks:

In general, I think there are not enough partial methods defined and called in the Entity Framework T4 templates. To be frank, 'not enough' is an understatement: I didn't find a single one.

Optimistic concurrency using a SQL Timestamp in Entity Framework 4.0

This article explains how to implement optimistic concurrency checking in Entity Framework 4.0, using a SQL Server Timestamp column. But you could have derived that from its title.

What is a Timestamp?

Despite its name, the SQL Server Timestamp data type has nothing to do with time. DateTime2, on the other hand, is DateTime too [sorry, I couldn't resist that]. A timestamp is just an eight-byte binary number that indicates the relative sequence in which data modifications took place in a database. The value in a column of type Timestamp is always provided by SQL Server: it is generated when a row is inserted, and bumped with each update to the row, to an ever-increasing value that is unique within the whole database. The Timestamp data type was initially conceived to assist in recovery operations, and you can also use it to synchronize distributed databases: based on the timestamp you can detect the order in which data was added or updated, and replay a sequence of modifications. But the most common usage of Timestamp is optimistic concurrency checking: when updating a row, you compare its current timestamp with the one you fetched originally. If the values differ, you know that someone else updated the row behind your back. And you know this without having held any locks on the server while you were busy, so it's a very scalable solution. This type of concurrency checking is called 'optimistic' because you assume that in most cases you will be able to update successfully, without the need for conflict resolution.
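Stripped of any framework, the optimistic check boils down to an UPDATE that carries the originally fetched timestamp in its WHERE-clause. Here's a hand-rolled sketch in plain ADO.NET (the connection string is an assumption, and the table and column names anticipate the FerengiRule sample below):

```csharp
using System.Data.SqlClient;

public static class OptimisticUpdate
{
    // Returns false when the row was changed (or deleted) by someone else
    // since we originally read it: a concurrency conflict.
    public static bool TryUpdateText(
        string connectionString, int id, string newText, byte[] originalTimestamp)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "UPDATE dbo.FerengiRule SET [Text] = @text " +
            "WHERE ID = @id AND [Timestamp] = @originalTimestamp", connection))
        {
            command.Parameters.AddWithValue("@text", newText);
            command.Parameters.AddWithValue("@id", id);
            command.Parameters.AddWithValue("@originalTimestamp", originalTimestamp);

            connection.Open();

            // Zero affected rows means the timestamp no longer matches.
            return command.ExecuteNonQuery() == 1;
        }
    }
}
```

The Entity Framework generates essentially this statement for you when the Timestamp column's Concurrency Mode is set to Fixed, as shown below.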
A sample of the geeky kind

Let's create a table with such a column (the scripts folder in the provided solution at the end of this article contains a full script):

CREATE TABLE [dbo].[FerengiRule](
    [ID] [int] NOT NULL,
    [Text] [nvarchar](100) NOT NULL,
    [Source] [nvarchar](100) NOT NULL,
    [Timestamp] [timestamp] NOT NULL,
 CONSTRAINT [PK_FerengiRule] PRIMARY KEY CLUSTERED
(
    [ID] ASC
)

Populate it with your favorite 'Ferengi Rules of Acquisition'; you'll find all of them here. In a WPF solution, create an entity model and add the table to it. You'll see that the Timestamp column has Fixed as the value of its Concurrency Mode property, so the column will appear in the WHERE-clause of any update or delete query, and Computed as the value of its StoreGeneratedPattern property, so a new value is expected from the server after each insert or update. Next, build an application with a fancy transparent startup screen that allows you to open multiple edit windows on the same data. The startup screen could look like this:

The main window of the application contains just an editable grid on the table's contents. It allows you to set the concurrency resolution mode, upload local modifications to the database, get the current server data, and, last but not least, restore the table to its original contents (an extremely useful feature in this type of application). Here's what the window looks like:

Visualizing a Timestamp value

Just like GUIDs and technical keys, timestamp values should normally be kept off the user interface. This demo is an exception to that rule, so I built a TimestampToDouble converter to translate the eight-byte binary number into something more readable.
I can't guarantee readable output on an old, active database where the timestamp values run high, but it works fine for a demo on a fresh database:

public class SqlTimestampToDoubleConverter : IValueConverter
{
    public object Convert(object value, Type targetType, object parameter, System.Globalization.CultureInfo culture)
    {
        if (value == null)
        {
            return null;
        }

        byte[] bytes = value as byte[];
        double result = 0;
        double inter = 0;
        for (int i = 0; i < bytes.Length; i++)
        {
            inter = System.Convert.ToDouble(bytes[i]);
            inter = inter * Math.Pow(2, ((7 - i) * 8));
            result += inter;
        }
        return result;
    }

    public object ConvertBack(object value, Type targetType, object parameter, System.Globalization.CultureInfo culture)
    {
        throw new NotImplementedException();
    }
}

Concurrency

Generated SQL Queries

If you run SQL Server Profiler while modifying data through the main window, you'll see that the Entity Framework query's WHERE-clause contains the timestamp, and that after the update the new value is fetched for you. If there's no row to be updated, someone else must have modified (or deleted) it. You can test this very easily with the sample application by opening multiple edit windows and playing around with the data.

Client side conflict resolution

When the Entity Framework discovers a concurrency violation while saving your changes, it appropriately throws an OptimisticConcurrencyException. It's then time for you to solve the conflict. In most cases that means fetching the current values, sending them to the GUI, and letting the end user decide what should happen.
The data access layer code for inserts and updates will look like this:

foreach (var rule in this.rules)
{
    try
    {
        switch (rule.ChangeTracker.State)
        {
            case ObjectState.Added:
                entities.FerengiRules.AddObject(rule);
                entities.SaveChanges();
                break;
            case ObjectState.Modified:
                entities.FerengiRules.ApplyChanges(rule);
                entities.SaveChanges();
                break;
            default:
                break;
        }
    }
    catch (OptimisticConcurrencyException)
    {
        // Return current values
        // ...
    }
}

Server side conflict resolution

The Entity Framework also provides automated conflict resolution strategies that might be useful in some scenarios (although, to be honest, I can't think of any). There's a Refresh method that lets you decide whether the client version or the server (store) version should win when there's a conflict. Here's how the catch-block could look:

catch (OptimisticConcurrencyException)
{
    switch (this.conflictResolution)
    {
        case ConflictResolution.Default:
            throw;
        case ConflictResolution.ServerWins:
            entities.Refresh(RefreshMode.StoreWins, rule);
            entities.SaveChanges();
            break;
        case ConflictResolution.ClientWins:
            entities.Refresh(RefreshMode.ClientWins, rule);
            entities.SaveChanges();
            break;
        default:
            break;
    }
}

Source Code

Oops... almost forgot: here's the source code of the whole thing: U2UConsult.EF40.OptimisticConcurrency.Sample.zip (470,96 kb)

Enjoy!

WPF 4.0 and Windows 7: get more Functionality per Square Inch

The Windows 7 taskbar comes with nice features like a thumbnail preview with clickable thumb buttons, a progress bar in the taskbar item, and jump list items and task items in its context menu. Access to the shell integration API from the managed world goes through the System.Windows.Shell namespace that comes with .NET 4.0. This namespace decorates Window and Application with some new dependency properties, so you'd better make use of it. Here's an overview of the TaskbarItemInfo object model:

The screenshot comes from a small application that demonstrates this object model. Here's what the main form looks like:

The Chart and Pie buttons switch the displayed chart type; the Recalculate button simulates a long-running operation (just an excuse for showing the progress bar). Here's the alternative view:

TaskbarItemInfo

When you hover with the mouse over the taskbar item, it shows only the relevant part of the screen in its thumbnail, plus the operational buttons - just like in the first image above.
Everything is defined in the XAML as follows:

<Window.TaskbarItemInfo>
    <TaskbarItemInfo
       x:Name="taskBarItemInfo1"
       ThumbnailClipMargin="5 95 100 65"
       Description="Taskbar Item Info Sample">
        <TaskbarItemInfo.ThumbButtonInfos>
            <ThumbButtonInfoCollection>
                <ThumbButtonInfo
                   Click="Recalculate_Click"
                   Description="Recalculate"
                   ImageSource="/Assets/Images/refresh.png" />
                <ThumbButtonInfo
                   Click="BarChart_Click"
                   Description="Bar Chart"
                   ImageSource="/Assets/Images/chart.png" />
                <ThumbButtonInfo
                   Click="Pie_Click"
                   Description="Pie Chart"
                   ImageSource="/Assets/Images/pie.png"/>
            </ThumbButtonInfoCollection>
        </TaskbarItemInfo.ThumbButtonInfos>
    </TaskbarItemInfo>
</Window.TaskbarItemInfo>

I hooked the thumb buttons to the same click events as their counterparts on the main form, but everything also works with command bindings. In fact, the taskbar item has more FSI than the original app: that's Functionality per Square Inch.

Overlay

The taskbar icon can be decorated with an extra image to inform the user about the state of the application. This overlay image can be set in XAML (through data binding, if you want), but in a lot of cases you'll want to change it from source code. Here's how that is done:

this.taskBarItemInfo1.Overlay = new BitmapImage(
    new Uri(
        @"..\..\Assets\Images\pie.png",
        UriKind.Relative));

ProgressBar

The taskbar item can also be decorated with a progress bar.
This is how it's driven from C#:

this.taskBarItemInfo1.ProgressState = TaskbarItemProgressState.Normal;
for (double d = 0; d < 80; d++)
{
    this.taskBarItemInfo1.ProgressValue = d / 100;
    Thread.Sleep(50);
}
this.taskBarItemInfo1.ProgressState = TaskbarItemProgressState.Paused;
Thread.Sleep(2000);
this.taskBarItemInfo1.ProgressState = TaskbarItemProgressState.Error;
for (double d = 80; d > 0; d--)
{
    this.taskBarItemInfo1.ProgressValue = d / 100;
    Thread.Sleep(50);
}
this.taskBarItemInfo1.ProgressState = TaskbarItemProgressState.None;

Here's the progress bar in action:

ProgressValue is a value between 0 and 1, while ProgressState is one of the following:

None: no progress bar at all
Normal: a green bar
Paused: a yellow bar
Error: a red bar
Indeterminate: a zigzagging green bar (very nice in combination with a wait cursor on the form)

Source Code

As always you get the full source code (VS 2010): U2UConsult.Windows7.Sample.zip (864,23 kb)

Enjoy!

WCF Data Services 4.0 in less than 5 minutes

WCF Data Services 4.0 (formerly known as ADO.NET Data Services, formerly known as Astoria) is one of the ways to expose an Entity Data Model from Entity Framework 4.0 in a RESTful / OData way. This article explains how to create such a data service and how to consume it with a browser and with a WPF client.

The Data Service

Start with an empty ASP.NET Application: Add a WCF Data Service to it: Also add an Entity Data Model to the ASP.NET project: Follow the Model Wizard to create a model containing entities on top of the Employee and Person tables from the AdventureWorks2008 database: In the designer, you should have something like this: A lot of code was generated; let's add our own fifty cents in the service's code-behind. First let it inherit from DataService<AdventureWorks2008Entities>:

public class WcfDataService : DataService<AdventureWorks2008Entities> { .. }

Then modify the InitializeService method as follows. This exposes all operations and grants all access rights (not really a production setting):

public static void InitializeService(DataServiceConfiguration config)
{
    config.SetEntitySetAccessRule("*", EntitySetRights.All);
    config.SetServiceOperationAccessRule("*", ServiceOperationRights.All);
    config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
}

Believe it or not, we're done (for the first part): the entity model is now exposed in a RESTful way. At the root URL you get an overview of the exposed entities. In the attached sample the root URL is "http://localhost:1544/WcfDataService.svc", but you may of course end up with another port number: At the "/Employees" address you find all employees: In your browser this list of employees may appear like this: This means it's time to -at least temporarily- disable your RSS feed reading view.
Here's how to do this in IE: To reach an individual entity, just type its primary key value in parentheses at the end of the URL, like "http://localhost:1544/WcfDataService.svc/Employees(1)": You can navigate via the relationships between entities. This is how to reach the Person entity connected to the first Employee. The URL is "http://localhost:1544/WcfDataService.svc/Employees(1)/Person": Other OData URI options can be found here, including:

Filtering: http://localhost:1544/WcfDataService.svc/Employees?$filter=JobTitle eq 'Chief Executive Officer'
Projection: http://localhost:1544/WcfDataService.svc/Employees?$select=JobTitle,Gender
Client side paging: http://localhost:1544/WcfDataService.svc/Employees?$skip=5&$top=2

Version 4.0 also includes support for server side paging. This gives you some control over the server's resources. Add the following line in the InitializeService method:

config.SetEntitySetPageSize("Employees", 3);

Only 3 employees will be returned now, even if the client requested all:

A Client

Enough XML for now. WCF Data Services also expose a client side model that allows you to use LINQ. Create a new WPF application: Add a Service Reference to the WCF Data Service: Decorate the Window with two buttons and a listbox. It should look more or less like this: The ListBox will display Employee entities through a data template (OK, that's XML again):

<ListBox
    Name="employeesListBox"
    ItemTemplate="{StaticResource EmployeeTemplate}"
    Margin="4"
    Grid.Row="1"/>

Here's the template.
It not only binds to Employee properties, but also to Person attributes:

<DataTemplate
    x:Key="EmployeeTemplate">
    <StackPanel>
        <StackPanel Orientation="Horizontal">
            <TextBlock
                Text="{Binding Path=Person.FirstName}"
                FontWeight="Bold"
                Padding="0 0 2 0"/>
            <TextBlock
                Text="{Binding Path=Person.LastName}"
                FontWeight="Bold"/>
        </StackPanel>
        <StackPanel Orientation="Horizontal">
            <TextBlock
                Text="{Binding Path=JobTitle}"
                Width="180"/>
            <TextBlock
                Text="{Binding Path=VacationHours}"
                Width="60"
                TextAlignment="Right" />
            <TextBlock
                Text=" vacation hours taken." />
        </StackPanel>
    </StackPanel>
</DataTemplate>

The Populate-Button fetches some Employee entities together with their related Person entity, and binds the collection to the ListBox (in version 4.0 two-way bindings are supported for WPF):

private void Populate_Click(object sender, RoutedEventArgs e)
{
    AdventureWorks2008Entities svc =
        new AdventureWorks2008Entities(
            new Uri("http://localhost:1544/WcfDataService.svc"));

    this.employeesListBox.ItemsSource =
        svc.Employees.Expand("Person").Where(emp => emp.BusinessEntityID < 100);
}

Here's the result: The Update-Button updates the number of vacation hours of the company's CEO.
It fetches the Employee, updates its VacationHours property, then tells the state manager to update the employee's state, and finally persists the data:

private void Update_Click(object sender, RoutedEventArgs e)
{
    AdventureWorks2008Entities svc =
        new AdventureWorks2008Entities(
            new Uri("http://localhost:1544/WcfDataService.svc"));

    Employee employee =
        svc.Employees.Where(emp => emp.BusinessEntityID == 1).First();

    employee.VacationHours++;

    svc.UpdateObject(employee);
    svc.SaveChanges();
}

If you now repopulate the listbox, you will see the increased value:

Source Code

Here's the full source code of this sample (just requires VS2010 with no extra downloads): U2UConsult.WcfDataServices.Sample.zip (96,59 kb) Enjoy!
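By the way, these client side LINQ queries are translated into the same OData URI options we saw in the browser. Here's a sketch of how to inspect that translation; it assumes the same generated AdventureWorks2008Entities context and Employee type from the service reference, and nothing is sent over the wire until a query is enumerated:

```csharp
using System;
using System.Data.Services.Client;
using System.Linq;

class UriSketch
{
    static void Main()
    {
        AdventureWorks2008Entities svc =
            new AdventureWorks2008Entities(
                new Uri("http://localhost:1544/WcfDataService.svc"));

        // Translates to .../Employees?$filter=JobTitle eq 'Chief Executive Officer'
        var ceos = svc.Employees
            .Where(emp => emp.JobTitle == "Chief Executive Officer");

        // Translates to .../Employees?$skip=5&$top=2
        var page = svc.Employees.Skip(5).Take(2);

        // DataServiceQuery builds the request URI lazily,
        // so we can print it without executing the query:
        Console.WriteLine(((DataServiceQuery<Employee>)ceos).RequestUri);
        Console.WriteLine(((DataServiceQuery<Employee>)page).RequestUri);
    }
}
```

Printing RequestUri like this is a handy way to debug which filters actually reach the service.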

MEF in 300 seconds

The Managed Extensibility Framework (MEF) is the .NET framework for building applications that are extensible with external plugins after being deployed. I already hear you thinking: "D'oh! Not yet another Reflection slash Inversion of Control slash Composition framework?" And yes, that's what it is. But here's the good news: it's easy to use, it fits in a single assembly, and it comes with good credentials and a proven track record. The assembly -System.ComponentModel.Composition- ships with the .NET framework 4.0 as well as with Silverlight 4.0. You can also download it from CodePlex together with source code and some flashy (but unfortunately also quite complex) examples.

Terminology

The MEF object model is built around the following concepts:

Part - a component that delivers or consumes services,
Export - a service (class or interface) provided by a part,
Import - a service (class or interface) consumed by a part,
Contract - a formal description of imports and exports,
Catalog - a means to publish and discover parts, and
Composition - building the object graph by hooking up exporters to importers.

The sample

I built a small console app to demonstrate the core features. Here's what it looks like: It looks dull on the outside, doesn't it? I agree, but the beauty is within. This application actually composes its object graph dynamically -at run time- from assemblies that it discovers in a folder. Functionally, the console application hooks together electric devices (laptop, television, XBox) and electric power suppliers (nuclear power plants, wind parks, brown coal power stations). Technically, the solution looks like this. Except for the Contract assembly, there are NO references between the projects:

Defining a Contract

I started with creating an assembly -called Contracts- that describes the services. It's nothing more than a class library with some interface definitions. It will be referenced by all application components - host and parts.
Here's the definition of the interfaces:

namespace Contracts
{
    public interface IPowerSupplier
    {
        string EnergySource { get; }
    }
}

namespace Contracts
{
    public interface IPowerConsumer
    {
        string Start();

        IPowerSupplier PowerSupplier { get; set; }
    }
}

Export: Exposing a Service

The next step is to build some classes that implement these interfaces, and expose their services to MEF. A class indicates exposed services with an [Export] attribute. This is my representation of the well-known Springfield Nuclear Power Plant:

namespace Springfield
{
    using System.ComponentModel.Composition; // MEF
    using Contracts; // Power Supplier Contract

    [Export(typeof(IPowerSupplier))]
    public class NuclearPowerPlant : IPowerSupplier
    {
        public string EnergySource
        {
            get
            {
                return "split atoms";
            }
        }
    }
}

Import: Requiring a Service

A class indicates the services that it requires from MEF through the [Import] attribute. The following is a representation of my laptop. It exports itself as an IPowerConsumer, but it expects an IPowerSupplier from MEF:

namespace My
{
    using System.ComponentModel.Composition; // MEF
    using Contracts;

    [Export(typeof(IPowerConsumer))]
    public class Laptop : IPowerConsumer
    {
        [Import]
        public IPowerSupplier PowerSupplier { get; set; }

        public string Start()
        {
            return string.Format(
                "{0} {1} is writing blog entries thanks to the {2} from {3} {4}.",
                this.GetType().Namespace,
                this.GetType().Name,
                this.PowerSupplier.EnergySource,
                this.PowerSupplier.GetType().Namespace,
                this.PowerSupplier.GetType().Name);
        }
    }
}

Hooking an importer to an exporter

Let's now focus on the host application that will do the composition.
It contains a PowerPlug class that directly hooks a Power Consumer to a Power Supplier. Through the [Import] attribute in its implementation, my Laptop already indicates to MEF that it requires a Supplier. So the PowerPlug itself only needs to inform MEF that it requires a Consumer:

public class PowerPlug
{
    [Import(typeof(IPowerConsumer))]
    public IPowerConsumer consumer { get; set; }
}

PowerPlug's constructor looks up all assemblies in a specified directory, then loads the classes that are decorated with the mentioned attributes, and finally hooks them all up:

public PowerPlug()
{
    var catalog = new DirectoryCatalog(@"../../Hooked");
    var container = new CompositionContainer(catalog);
    container.ComposeParts(this);
}

All you need to do is instantiate the PowerPlug, and access the object graph:

PowerPlug plug = new PowerPlug();
Console.WriteLine(string.Format(" {0}\n", plug.consumer.Start()));

Please observe that both consumer and supplier were discovered and connected.

Importing many Exports

MEF allows you to import whole collections of instances that expose the same service. So I created another power supplier, the Thorntonbank Wind Farm near the Belgian coast:

namespace ThorntonBank
{
    using System.ComponentModel.Composition; // MEF
    using Contracts; // Power Supplier Contract

    [Export(typeof(IPowerSupplier))]
    public class WindFarm : IPowerSupplier
    {
        public string EnergySource
        {
            get
            {
                return "wind";
            }
        }
    }
}

In the host application I created a DistributionOperator class that represents a power grid owner, like Elia. Instances of the class require a list of Suppliers from MEF.
DistributionOperator exposes that need through the [ImportMany] attribute:

[ImportMany]
public IEnumerable<IPowerSupplier> Suppliers { get; set; }

Its constructor is similar to the one from PowerPlug: it looks up all classes in a directory (please note that MEF is equipped with other catalog types). At run time, just instantiate, and the object model will be available automagically:

DistributionOperator elia = new DistributionOperator();

foreach (var supplier in elia.Suppliers)
{
    Console.WriteLine(
        " {0} {1} provides electricity from {2}.",
        supplier.GetType().Namespace,
        supplier.GetType().Name,
        supplier.EnergySource);
}

Exposing Metadata

Parts can expose metadata in the form of key-value pairs to MEF through the [ExportMetadata] attribute. The ThorntonBank WindFarm could indicate to MEF that it's a very Kyoto-friendly power source by decorating itself as follows:

[ExportMetadata("EnergyType", "Renewable")]
[Export(typeof(IPowerSupplier))]
public class WindFarm : IPowerSupplier {...}

This works, but in the strongly-typed world it just makes more sense to define some helper classes in the Contracts-assembly:

namespace Contracts
{
    public enum EnergyTypes
    {
        Renewable,
        Questionable,
        GloballyWarming
    }

    public class PowerSupplierMetadata
    {
        public const string EnergyType = "EnergyType";
    }
}

This results in a safer way to expose the same information, but also one that's easier to consume by other components:

[ExportMetadata(PowerSupplierMetadata.EnergyType, EnergyTypes.Renewable)]
[Export(typeof(IPowerSupplier))]
public class WindFarm : IPowerSupplier
{
    // ...
}

Consuming Metadata

MEF components can access the metadata, otherwise the concept would make no sense. The MEF assembly contains the very useful (not MEF-specific) Lazy<T> class, together with (MEF-specific) Lazy<T, TMetadata>.
The latter gives the framework access to the class' metadata without the need for instantiation. TMetadata exposes the keys in the metadata as read-only properties. So I needed to create yet another helper (the Contracts-assembly seemed a good place):

public interface IPowerSupplierMetadata
{
    EnergyTypes EnergyType { get; }
}

The DistributionOperator class can now be extended with a list of Lazy<IPowerSupplier, IPowerSupplierMetadata>:

[ImportMany]
private IEnumerable<Lazy<IPowerSupplier, IPowerSupplierMetadata>> SuppliersWithMetadata { get; set; }

The instances of this list get a Metadata property holding the metadata (obviously), and a Value property that gives access to the PowerSupplier instance. So the DistributionOperator could expose a list of green power suppliers like this:

public IEnumerable<IPowerSupplier> KyotoFriendlySuppliers()
{
    var query = from s in this.SuppliersWithMetadata
                where s.Metadata.EnergyType.Equals(EnergyTypes.Renewable)
                select s.Value;
    return query;
}

Here's how the console app consumes the list:

foreach (var supplier in elia.KyotoFriendlySuppliers())
{
    Console.WriteLine(
        " {0} {1} provides green electricity from {2}.",
        supplier.GetType().Namespace,
        supplier.GetType().Name,
        supplier.EnergySource);
}

What's next

This is as good as it gets - at least in 300 seconds. If you're still interested in MEF, then make sure you don't skip this excellent article in MSDN Magazine.

Source code

Here's the full source code. It is based on preview 9 of MEF from CodePlex, and ye olde Visual Studio.NET 2008: U2UConsult.MEF.Sample.zip (2,52 mb) Enjoy!
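PS: if you'd like to experiment with MEF without the multi-project setup, the whole export/import/composition cycle fits in a single console file. Here's a minimal sketch; the IGreeter types are illustrative and not part of the sample, and composition happens from the executing assembly instead of a directory:

```csharp
using System;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
using System.Reflection;

public interface IGreeter { string Greet(); }

// The part: exposes its service to MEF.
[Export(typeof(IGreeter))]
public class EnglishGreeter : IGreeter
{
    public string Greet() { return "hello"; }
}

// The host: requires an IGreeter from MEF.
public class Host
{
    [Import]
    public IGreeter Greeter { get; set; }

    public static void Main()
    {
        Host host = new Host();

        // Discover parts in the current assembly and satisfy the imports.
        var catalog = new AssemblyCatalog(Assembly.GetExecutingAssembly());
        var container = new CompositionContainer(catalog);
        container.ComposeParts(host);

        Console.WriteLine(host.Greeter.Greet()); // prints "hello"
    }
}
```

Swap the AssemblyCatalog for the DirectoryCatalog from the article and you're back at the plugin-folder scenario.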