Archive for the ‘ADO.NET’ Category

Localizing Linq to SQL Entities

August 18th, 2008 by Sidar Ok

Back from the holidays! Not getting too much sun certainly encourages writing code rather than chilling out. This subject has been on my list for a while: as Linq to SQL has matured, the need to use it in multi-cultural applications has grown with it. An old post of Ayende's also got me thinking about how a similar problem could be solved in Linq to SQL.

I’ll use the same model that he provided, and it is the following:


Figure 1. Table structure for multi-lingual products


As in the original post, the challenge is to load only the Product Names for the current (or a specific) culture, not all of the names related to a product. In NHibernate, there are filters to solve this problem in an elegant way. It is elegant because it is externally configurable and adds no intrusiveness to your design.

When internationalisation/localization comes into play, there are two main approaches from a domain perspective, and the choice lies behind the answer to the question:

“Is localization a concern of my domain?”

In fairness, the answer changes for every domain (in my experience it is usually either no, or a concern of a different domain, such as administration). A simple way of deciding is to check whether the domain needs to know about different cultures, or whether domain elements in different languages need to talk to each other (can Reuters publish news in Portuguese?). If the answer is yes, then even eager loading all language translations can be an option. Otherwise, we need to abstract this away so that the domain won't know about this infrastructural concern.

In the original post, Ayende uses filters in NHibernate. In Linq to SQL we don't have filters, but as mentioned before, we have load options to supply a criterion and reduce the amount of data we retrieve.

So we expect the following test to pass. Note that this is a state-based test verifying that no more than one name is retrieved.

/// <summary>
/// A test for GetProduct
/// </summary>
[TestMethod()]
public void GetProductTest()
{
  ProductsRepository target = new ProductsRepository();
  int prodId = 1;
  int lcId = 3;
  Product actual = target.GetProduct(prodId, lcId);
  Assert.AreEqual("Prod13", actual.Name);
  Assert.IsTrue(actual.ProductNames.Count == 1);
}

Where the records in the table are as follows:


Figure 2. Records in the Product Names table. As seen, there are 2 records for product id '1'

The entity structure that we have to use with Linq to SQL (generated courtesy of the designer) is as follows:


Figure 3. Object Model of Product and Product Name

Looks innocent, doesn't it? The catch is that Product will always have a list of ProductNames, which in my case will always have one element. If I want to keep my domain ignorant of localization this is certainly a bad thing, but it is what Linq to SQL gives me by default. There are ways to overcome this issue of course, but they are not the point of this post.

In addition to the model, I'll add another field called "Name" that is not mapped to any column in the db, to reach the same example. This is achieved with a partial class:

partial class Product
{
    public string Name
    {
        get;
        set;
    }
}

Now we are ready to write the code that passes the test. Note that we are utilizing the AssociateWith generic method to do the necessary filtering.

/// <summary>
/// Gets the product for the current culture.
/// </summary>
/// <param name="prodId">The product id.</param>
/// <param name="lcId">The locale id used for the localization filter, or null to load all names.</param>
/// <returns>The product, with only the requested culture's name loaded when a locale id is given.</returns>
public Product GetProduct(int prodId, int? lcId)
{
    using (ProductsDataContext context = new ProductsDataContext())
    {
        // set load options if a localization filter is needed
        if (lcId.HasValue)
        {
            DataLoadOptions options = new DataLoadOptions();
            options.AssociateWith<Product>(p => p.ProductNames.Where<ProductName>(pn => pn.CultureId == lcId));
            context.LoadOptions = options;
        }

        Product pFromDb = context.Products.Single<Product>(p => p.ProductId == prodId);

        return new Product()
        {
            Amount = pFromDb.Amount,
            ProductId = pFromDb.ProductId,
            Size = pFromDb.Size,
            Name = pFromDb.ProductNames.First<ProductName>().Name,
            ProductNames = pFromDb.ProductNames
        };
    }
}

Now that we are done with the original post, let's go beyond the bar and implement inserts and updates too. With inserts, there are two cases I am going to handle: 1 - it is a brand new insert; 2 - it is just an insert of a new product name in another language.

For the first one, here is the test:

/// <summary>
/// A test for InsertProduct
/// </summary>
[TestMethod()]
public void Should_Insert_for_Completely_New_Prod()
{
    ProductsRepository target = new ProductsRepository();
    Product p = new Product()
    {
        Amount = 31,
        Name = "English Name",
        ProductId = 0,
        Size = 36,
    };
    int lcId = 7;
    using (TransactionScope scope = new TransactionScope(TransactionScopeOption.Suppress))
    {
        target.InsertProduct(p, lcId);
        Assert.IsTrue(p.ProductId > 0);
        Assert.IsTrue(p.ProductNames.Count > 0);
    }
}

And for the second one:

/// <summary>
/// A test for InsertProduct
/// </summary>
[TestMethod()]
public void Should_Insert_Name_for_Existing_Prod()
{
    ProductsRepository target = new ProductsRepository();
    Product p = target.GetProduct(1, null); // no culture filter: load all names
    int firstCount = p.ProductNames.Count;
    p.Name = "Kurdish Name";
    int lcId = 9;
    using (TransactionScope scope = new TransactionScope(TransactionScopeOption.Suppress))
    {
        target.InsertProduct(p, lcId);
        Product prAfterInsert = target.GetProduct(p.ProductId, null);
        Assert.AreEqual(firstCount + 1, prAfterInsert.ProductNames.Count);
    }
}

So the implementation that passes these tests is obvious: I need to do an extra insert into the Products table if the product is new, and that's it:

/// <summary>
/// Inserts the product.
/// </summary>
/// <param name="p">The product to insert.</param>
/// <param name="lcId">The locale id.</param>
public void InsertProduct(Product p, int lcId)
{
    using (ProductsDataContext context = new ProductsDataContext())
    {
        if (p.ProductId == 0)
        {
            // insert only if it is new
            context.Products.InsertOnSubmit(p);
        }

        InsertProductNameForProduct(context, p, lcId);
        context.SubmitChanges();
    }
}

/// <summary>
/// Inserts the product name for the product.
/// </summary>
/// <param name="context">The context.</param>
/// <param name="p">The product.</param>
/// <param name="lcId">The locale id.</param>
private void InsertProductNameForProduct(ProductsDataContext context, Product p, int lcId)
{
    context.ProductNames.InsertOnSubmit(new ProductName()
    {
        CultureId = lcId,
        Name = p.Name,
        ProductId = p.ProductId,
        Product = p,
    });
}

And last, the update. Apart from the obvious part, there is one situation we need to handle: if the name of the product has changed, then we need to update it as well. For the other fields, the regular update applies. Here is the test that codifies the statement:

/// <summary>
/// A test for UpdateProduct
/// </summary>
[TestMethod()]
public void should_update_product_and_its_current_name()
{
    ProductsRepository target = new ProductsRepository();
    Product p = target.GetProduct(1, 2);
    p.Name = "French Name";
    p.Amount = 40;
    p.Size = 55;
    using (TransactionScope scope = new TransactionScope(TransactionScopeOption.Suppress))
    {
        target.UpdateProduct(p);
        Assert.AreEqual("French Name", p.Name);
        Assert.AreEqual(40, p.Amount);
        Assert.AreEqual(55, p.Size);
    }
}

After writing the test, the implementation below becomes obvious:

public void UpdateProduct(Product p)
{
   // since we never load more than one product name, we can assume
   // the loaded one is the name being updated
   using (ProductsDataContext context = new ProductsDataContext())
   {
       context.Products.Attach(p, true);
       ProductName currentName = p.ProductNames.Single<ProductName>();
       if (p.Name != currentName.Name)
       {
           // the name changed, update it
           currentName.Name = p.Name;
       }
       context.SubmitChanges();
   }
}

I showed a possible strategy to localize Linq to SQL entities in this post. Of course, more complex scenarios such as child entities and lazy loading issues could be thought through more thoroughly, but I hope this gives you a starting point to attack the whole idea.

Comments and critiques are well appreciated, as always.


10 Tips to Improve your LINQ to SQL Application Performance

May 2nd, 2008 by Sidar Ok

Hey there, back again. In my first post about LINQ I tried to provide a brief (okay, a bit detailed) introduction for those who want to get involved with LINQ to SQL. In that post I promised to write about a basic integration of WCF and LINQ to SQL working together, but this is not that post.

Since LINQ to SQL is both a code generator and an ORM, and it offers a lot of things, it is natural to be suspicious about its performance. That suspicion is justified up to a point, as LINQ comes with its own penalties. But there are several benchmarks showing that DLINQ reaches up to 93% of ADO.NET SqlDataReader performance when the optimizations are done correctly.

Hence, I have summed up the 10 points that I consider most important when tuning data retrieval and modification in LINQ to SQL:

1 – Turn off ObjectTrackingEnabled Property of Data Context If Not Necessary

If you are only retrieving data as read-only and not modifying anything, you don't need object tracking, so turn it off as in the example below:

using (NorthwindDataContext context = new NorthwindDataContext())
{
  context.ObjectTrackingEnabled = false;
}

This turns off the unnecessary identity management of the objects; the DataContext will not have to store them, because it can be sure that there will be no change statements to generate.

2 – Do NOT Dump All Your DB Objects into One Single DataContext

A DataContext represents a single unit of work, not your whole database. If you have several database objects that are not connected to each other, or that are not used at all (log tables, objects used by batch processes, etc.), they just consume memory unnecessarily, increasing the identity management and object tracking costs in the CUD engine of the DataContext.

Instead, think of separating your model into several DataContexts, each representing a single unit of work. You can still configure them to use the same connection via their constructors, so as not to lose the benefit of connection pooling.
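As a sketch of what that separation could look like, two narrow contexts can share one connection through their generated constructors. SalesDataContext and LoggingDataContext below are hypothetical designer-generated contexts; the real LINQ to SQL part is the generated constructor overload that accepts an IDbConnection.

string connectionString = "Data Source=.;Initial Catalog=Shop;Integrated Security=True"; // sample value
using (SqlConnection connection = new SqlConnection(connectionString))
{
    // each context owns a small, related slice of the schema,
    // but both reuse the same pooled connection
    using (SalesDataContext sales = new SalesDataContext(connection))
    using (LoggingDataContext logging = new LoggingDataContext(connection))
    {
        // work with each unit of work independently
    }
}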

3 – Use CompiledQuery Wherever Needed

When you create and execute a query, there are several steps involved in generating the appropriate SQL from the expression; to name the important ones:

  1. Create expression tree

  2. Convert it to SQL

  3. Run the query

  4. Retrieve the data

  5. Convert it to the objects

As you may notice, when you use the same query over and over, the first and second steps are just wasted time. This is where a tiny class in the System.Data.Linq namespace achieves a lot: with CompiledQuery, you compile your query once and store it somewhere for later use. This is achieved by the static CompiledQuery.Compile method.

Below is a Code Snippet for an example usage:

Func<NorthwindDataContext, IEnumerable<Category>> func =
   CompiledQuery.Compile<NorthwindDataContext, IEnumerable<Category>>
   ((NorthwindDataContext context) => context.Categories.
      Where<Category>(cat => cat.Products.Count > 5));


And now "func" is my compiled query. It will only be compiled once, when it is first run. We can store it in a static utility class as follows:

/// <summary>
/// Utility class to store compiled queries
/// </summary>
public static class QueriesUtility
{
  // compiled once and cached in a static field, so every access
  // returns the same delegate instead of recompiling the query
  private static readonly Func<NorthwindDataContext, IEnumerable<Category>> categoriesWithMoreThanFiveProducts =
    CompiledQuery.Compile<NorthwindDataContext, IEnumerable<Category>>
    ((NorthwindDataContext context) => context.Categories.
      Where<Category>(cat => cat.Products.Count > 5));

  /// <summary>
  /// Gets the query that returns categories with more than five products.
  /// </summary>
  /// <value>The query containing categories with more than five products.</value>
  public static Func<NorthwindDataContext, IEnumerable<Category>> GetCategoriesWithMoreThanFiveProducts
  {
    get { return categoriesWithMoreThanFiveProducts; }
  }
}

And we can use this compiled query (since it is now nothing but a strongly typed delegate to us) very easily, as follows:

using (NorthwindDataContext context = new NorthwindDataContext())
{
  IEnumerable<Category> categories =
    QueriesUtility.GetCategoriesWithMoreThanFiveProducts(context);
}

Storing and using it this way also reduces the cost of the virtual call that would otherwise be made each time you build the query; it is effectively reduced to one call. If you never invoke the query, don't worry about compilation either, since it happens only when the query is first executed.

4 – Filter Data Down to What You Need Using DataLoadOptions.AssociateWith

When we retrieve data with LoadWith, we retrieve all the associated data bound by the relationship, with no filter. In most cases, though, we need additional filtering, and this is where the generic DataLoadOptions.AssociateWith method comes in very handy. This method takes the criteria to load the data as a parameter and applies it to the query, so you get only the data that you need.

The code below associates and retrieves the categories only with their products that are not discontinued:

using (NorthwindDataContext context = new NorthwindDataContext())
{
  DataLoadOptions options = new DataLoadOptions();
  options.AssociateWith<Category>(cat=> cat.Products.Where<Product>(prod => !prod.Discontinued));
  context.LoadOptions = options;
}

5 – Turn Optimistic Concurrency Off Unless You Need It

LINQ to SQL comes with out-of-the-box optimistic concurrency support, with SQL timestamp columns mapped to the Binary type. You can turn this feature on and off both in the mapping file and via attributes on the properties. If your application can afford to run on a "last update wins" basis, then doing an extra update check is just a waste.

UpdateCheck.Never is used to turn optimistic concurrency off in LINQ to SQL.

Here is an example of turning optimistic concurrency off implemented as attribute level mapping:

[Column(Storage="_Description", DbType="NText",
            UpdateCheck=UpdateCheck.Never)]
public string Description
{
  get
  {
    return this._Description;
  }
  set
  {
    if ((this._Description != value))
    {
      this.OnDescriptionChanging(value);
      this.SendPropertyChanging();
      this._Description = value;
      this.SendPropertyChanged("Description");
      this.OnDescriptionChanged();
    }
  }
}

6 – Constantly Monitor Queries Generated by the DataContext and Analyze the Data You Retrieve

Since your SQL is generated on the fly, you may not be aware of additional columns or extra data being retrieved behind the scenes. Use the DataContext's Log property to see what SQL is being run by the DataContext. An example is as follows:

using (NorthwindDataContext context = new NorthwindDataContext())
{
  context.Log = Console.Out;
}


Using this snippet while debugging, you can see the generated SQL statements in the Output window of Visual Studio and spot performance leaks by analyzing them. Don't forget to comment that line out for production systems, as it may create a bit of overhead. (Wouldn't it be great if this were configurable in the config file?)

To see your DLINQ expressions rendered as SQL statements, you can use the SQL Query Visualizer, which needs to be installed separately from Visual Studio 2008.
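One way to get the config-file switch wished for above is to guard the Log assignment with an application setting. This is only a sketch: "LogLinqToSql" is a hypothetical appSettings key, and NorthwindDataContext is the designer-generated context.

using (NorthwindDataContext context = new NorthwindDataContext())
{
    // requires a reference to System.Configuration;
    // reads the hypothetical switch from app.config / web.config
    bool logSql;
    if (bool.TryParse(ConfigurationManager.AppSettings["LogLinqToSql"], out logSql) && logSql)
    {
        context.Log = Console.Out;
    }
}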

7 – Avoid Unnecessary Attaches to Tables in the Context

Object tracking is a great mechanism, but nothing comes for free. When you Attach an object to your context, you are saying that the object was disconnected for a while and you now want to bring it back into the game. The DataContext then marks it as an object that will potentially change, which is just fine when you really intend to change it.

But there are some circumstances that aren't very obvious and may lead you to attach objects that haven't changed. One such case is calling AttachAll on a collection without checking whether each object has actually changed. For better performance, make sure you attach ONLY the objects in the collection that have changed.

I will provide a sample code for this soon.
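In the meantime, here is a rough sketch of the idea, assuming your client-side code maintains its own changed-flag on entities. The IsDirty property below is hypothetical; LINQ to SQL does not provide one for disconnected objects.

public void UpdateProducts(IEnumerable<Product> products)
{
    using (NorthwindDataContext context = new NorthwindDataContext())
    {
        // attach only the entities that were actually modified
        List<Product> changed = products.Where(p => p.IsDirty).ToList();
        context.Products.AttachAll(changed, true); // asModified = true
        context.SubmitChanges();
    }
}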

8 – Be Careful of Entity Identity Management Overhead

While working with a context that is not read-only, the objects are still being tracked, so be aware of the non-intuitive scenarios this can cause. Consider the following DLINQ code:

using (NorthwindDataContext context = new NorthwindDataContext())
{
  var a = from c in context.Categories
  select c;
}

Very plain, basic DLINQ, isn't it? True, there doesn't seem to be anything wrong with the code above. Now let's see the code below:

using (NorthwindDataContext context = new NorthwindDataContext())
{
  var a = from c in context.Categories
  select new Category
  {
    CategoryID = c.CategoryID,
    CategoryName = c.CategoryName,
    Description = c.Description
  };
}

The intuition is to expect the second query to run slower than the first one, which is WRONG. It is actually much faster.

The reason is that in the first query, an object must be stored by the identity management system for each row, since there is a possibility you will still change it. In the second one, you are projecting into a new object that the context does not track, which is more efficient.

9 – Retrieve Only the Number of Records You Need

When you are binding to a data grid and doing paging, consider the easy-to-use methods that LINQ to SQL provides for this, mainly the Skip and Take methods. The snippet below is a method that retrieves just enough products for a ListView with paging enabled:

/// <summary>
/// Gets the products page by page.
/// </summary>
/// <param name="startingPageIndex">Index of the starting page.</param>
/// <param name="pageSize">Size of the page.</param>
/// <returns>The list of products in the specified page</returns>
private IList<Product> GetProducts(int startingPageIndex, int pageSize)
{
  using (NorthwindDataContext context = new NorthwindDataContext())
  {
    return context.Products
           .Skip<Product>(startingPageIndex * pageSize) // skip past the earlier pages first
           .Take<Product>(pageSize)                     // then take a single page
           .ToList<Product>();
  }
}

10 – Don’t Misuse CompiledQuery

I can hear you saying “What? Are you kiddin’ me? How can such a class like this be misused?”

Well, as with all optimization, LINQ to SQL is no exception:

"Premature optimization is the root of all evil" – Donald Knuth

If you use CompiledQuery, make sure you actually use it more than once, as the first execution is more costly than normal querying. But why?

Because the result of CompiledQuery is an object holding the SQL statement and a delegate to apply it. It is not compiled the way regular expressions can be compiled; the delegate simply binds your variables (or parameters) into the resulting query.

That's the end, folks. I hope you'll enjoy these tips while programming with LINQ to SQL. Any comments or questions, via sidarok at sidarok dot com or here on this post, are welcome.


A Brief Introduction to LINQ to SQL

April 21st, 2008 by Sidar Ok

Introduction to LINQ to SQL

I know there are a lot of LINQ to SQL introductions out there, and you are probably sick of them already, so I will keep this section as brief as possible.

Over the years, the community kept fancying strongly typed objects over the OO-unfriendly DataTables and DataSets, and kept pushing Microsoft to go that way. Although there were understandable reasons for the old approach (such as performance), in an object oriented world whose requirements keep getting more and more complex, typed DataSets could not be the answer, even if they made some things easier to manage.

This approach also had the consequence of ignoring multi-tier, professional real-world applications (it would be more difficult to support those scenarios in a 5-minute drag & drop presentation!).

Lots of companies (such as CodeSmith) saw an opportunity here, and they were right: they made a lot of money out of it.

So What is Linq 2 SQL and Where Does it Fit in This Picture?

Linq 2 SQL is both an ORM and a code generator. Although some (mostly Java guys) belittle it, I see it as a great step forward for the .NET environment, and it has a lot of nice features. It speeds up the process of managing data and generating entities (no tricks this time: really drag & drop & tweak, and it works). Of course it also has plenty of cons, which I will talk about in later posts.

The following gives an idea of where Linq 2 SQL resides in the whole LINQ to X family:


Figure 1.1 Linq Architecture Overview

Linq 2 SQL comes with very easy-to-use tools to ease code generation. The generated code is surprisingly good, using new .NET Framework 3.5 features such as automatic properties, extension methods and lambda expressions. It is fully object oriented and provides a very good level of abstraction.

Scott Guthrie explained a while ago how to use the designer, in several blog posts and a very nice video. So be calm, I won't go down that road.

The good thing about the designer is that it picks up just about everything you defined in your db: tables, functions, stored procedures, primary and foreign keys, and other constraints. It builds the object structure and the relations between objects based on these, and puts them into the DataContext you specified beforehand. You can manage your own DataContext and inherit from your DataContextBase for your architectural and business needs; the designer tool is friendly to this approach.

To understand how LINQ 2 SQL works, understanding the DataContext is essential. The DataContext itself has lots of good features.

Being Aware of Context

There are 4 DataContext constructors that let you create a data context. When you create a DataContext of your choice through the designer, it creates and associates a connection string in the settings file, or in app.config if one exists. Beware that if you want to override this behaviour in a configuration file (a common practice is to put it in web.config for web applications, for instance), you need to make sure that a connection string with the same name exists within the current context.

If you want to do it programmatically instead, you create the DataContext with an IDbConnection.
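For instance (a sketch: the "Northwind" connection string name is an assumption, and NorthwindDataContext is the designer-generated context, whose generated constructor accepts an IDbConnection):

string connectionString =
    ConfigurationManager.ConnectionStrings["Northwind"].ConnectionString;
using (IDbConnection connection = new SqlConnection(connectionString))
using (NorthwindDataContext context = new NorthwindDataContext(connection))
{
    // the context now uses the connection you created,
    // not the one stored in the settings file
}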

Understanding How DataContext Works

If I said that the LINQ 2 SQL world revolves around the DataContext, I wouldn't be exaggerating much. Our aim is to do basic insert, update and delete operations as quickly and efficiently as possible, so we need to understand that the DataContext is the gateway to that goal.

DataContext is pretty flexible and enables you to work directly with objects or run your own queries / stored procedures or user functions.

1 – Working with the Objects

A cool way of using the DataContext is to have it generate the Insert, Update and Delete SQL statements against your data model. This works pretty well if you really know what you are dealing with. After generating your context via the designer or the SqlMetal code generation tool, you have your DataContext with your entities generated and attached to it. Below is how the Northwind DataContext looks after generation:


Picture 1: Northwind Entities

As you see, the Categories table is mapped to a generic Table of Category, with a Category instance mapped to each row. On this table class there is a set of extension methods that you can use (and we will, in later posts) to construct SQL statements in a better, object oriented way.

However, as you would expect, the relations between types shouldn't be Table objects or anything else that reminds us of the db. EntitySet and EntityRef exist for this purpose. An EntitySet is a generic set of the specified entity, Category in our case. It implements the IList interface, so there is no harm in saying that it is basically a list, and we can use every method, including extension methods, that a generic IList benefits from.

If the relationship between the tables is one-to-one, an EntityRef is created for that entity instead.

Just keep in mind that these two are NOT serializable; we are going to discuss this issue in later posts.

There are 2 cool features of data context that we need to know here: Object Tracking and Deferred (or Lazy) Loading.

- Object Tracking

This is the change tracking system that Linq to SQL provides. If you want your queries to be generated on the fly automatically, this is the system that provides it. However, if you only want to select and perform read-only operations and don't want to track any changes, disabling it will improve retrieval performance.

To disable it, set the ObjectTrackingEnabled property of your context to false. The default is true.

The golden rule when working with object tracking is: "the DataContext needs to be aware of every insert, update and delete in order to generate the appropriate SQL statement". You need to tell the DataContext what objects to insert, what to delete and what to update.

You can work with a DataContext that has ObjectTrackingEnabled in two ways: connected and disconnected.

In connected mode everything is easy, and the world is a very nice place to live. You add, delete and update from/on the context, and these are all trivial operations: you just manage the collection, add, update or remove an object, call the magical context.SubmitChanges() method, and that's it. Your statements are generated and executed. Everyone is happy.

But things aren't that simple, and this is usually not the case. In a service oriented, n-tier world, or with long-running sessions, keeping the DataContext alive is an unreasonable hope. So in disconnected mode you need to state explicitly what you need to insert, update and delete.

You do this with a group of methods provided on the associated generic table. I will try to show these methods in action in another post, but to name them here:

For Insertions => InsertOnSubmit for a single entity, InsertAllOnSubmit for collections.

For Deletions => DeleteOnSubmit for a single entity and DeleteAllOnSubmit for collections.

For Updates => You need to use Attach for a single entity and AttachAll for collections. You'll see that the Attach methods have two overloads: one taking just the entity, and another taking an additional asModified boolean. Setting asModified to true means your entity will be included in the update generation process even if the context has not observed any changes to it.

The only way of informing the context that a disconnected object has been changed is through the Attach methods. For instance, if you want to delete an object that is not in the context, you first need to attach it to the context. Otherwise, your delete will fail with this message:

System.Data.Linq.ChangeConflictException: Row not found or changed

We do all this to inform the context about object tracking states. Each method changes the object's state to the corresponding value, so that the appropriate query can be generated at the end.
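Putting that together, a minimal sketch of deleting a disconnected entity could look like this (NorthwindDataContext is the designer-generated context):

public void DeleteCategory(Category disconnectedCategory)
{
    using (NorthwindDataContext context = new NorthwindDataContext())
    {
        // bring the detached object under this context's tracking first
        context.Categories.Attach(disconnectedCategory);
        context.Categories.DeleteOnSubmit(disconnectedCategory);
        context.SubmitChanges();
    }
}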

- Lazy Loading (Deferred Loading)

In LINQ to SQL, if you query for an object, you get the object that you requested. Nothing else: no child relations or back references are loaded. This is because DeferredLoadingEnabled is true by default. You can set it to false if you want to disable deferred loading, but that is usually not the best thing to do; we usually need to customize what to load and what not to load.

For this purpose, the context has a property called LoadOptions, of type DataLoadOptions, in the System.Data.Linq namespace.

The code below illustrates an example usage:

private IList<Category> GetDescribedCategoriesWithPicturesAndProducts()
{
   using (NorthwindDataContext context = new NorthwindDataContext())
   {
       System.Data.Linq.DataLoadOptions options = new System.Data.Linq.DataLoadOptions();
       options.LoadWith<Category>(ct => ct.Picture);
       options.LoadWith<Category>(ct => ct.Products); // first level
       options.LoadWith<Product>(p => p.Supplier);
       options.LoadWith<Product>(p => p.Order_Details); // second level
       context.LoadOptions = options; // don't forget to assign the options to the context

       // materialize the results before the context is disposed
       return context.Categories
              .Where<Category>(ct => !string.IsNullOrEmpty(ct.Description))
              .ToList<Category>();
    }
}

As you have already noticed, you can request second or even Nth level loading (I would appreciate it if you let me know the upper limit of this N). One good thing to keep in mind here is to avoid cyclic loads.

2 – Running Plain Text-Based SQL Queries

One other usage of the context is as a gateway to run your own predefined SQL statements.

a) ExecuteCommand

You can call DataContext.ExecuteCommand for statements that do not return result sets; it returns the number of rows affected.

using (NorthwindDataContext context = new NorthwindDataContext())
{
   int rowCount = context.ExecuteCommand(
     "UPDATE Categories SET Description = {0} WHERE CategoryID = {1}",
     "Beverages and more", 1);
}

b) ExecuteQuery

You can call the generic ExecuteQuery method and map the results to an entity of your choice as follows.

using (NorthwindDataContext context = new NorthwindDataContext())
{
  IEnumerable<Category> categories =
    context.ExecuteQuery<Category>("SELECT * FROM Categories");
}

This was the end of the introduction. I hope it helped you figure the concepts out and get started. In later posts I will try to get into more advanced topics and create a real world application using LINQ in a multi-tiered environment in conjunction with WCF.