
Continuing my deep dive into ASP.NET 5 (vNext), I started down the path of Entity Framework 7, which, similarly to ASP.NET 5, is essentially a reboot of the framework itself. For readers interested in diving in, I highly suggest watching the MVA video called What's New with ASP.NET 5, which goes over all of the changes in pretty good detail (though I have a running list of questions to ask at BUILD in a few weeks).

Noting that the Entity Framework 7 beta was included in my ASP.NET 5 project, I hit a road block trying to find it through the usual method in the NuGet Package Manager. As of this writing, only 6.1.3 was available. After looking around, the answer is to add another NuGet package source. I had done this previously when I set up a private NuGet package server at work to host common libraries used throughout all of our projects. For those unaware, go to Tools->NuGet Package Manager->Package Manager Settings.

Once there, click on Package Sources and then the + icon, enter a descriptive name for the source, paste in the feed URL, and click Update. After you're done, you should have something similar to this:
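If you'd rather skip the dialog, the same source can be registered by editing your NuGet.config directly. This is just a sketch; the key name and the URL below are placeholders, since the actual feed address depends on the pre-release feed you're adding:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Placeholder name and URL; substitute the actual pre-release feed address -->
    <add key="ASP.NET vNext" value="https://example.com/vnext-feed/api/v2" />
  </packageSources>
</configuration>
```

Visual Studio picks up the new source the next time the NuGet Package Manager is opened.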

You can now close out of that window and return to the NuGet Package Manager. Upon switching the Package Source dropdown to ASP.NET vNext (or whatever you called it in the previous screen), you should now see EntityFramework 7 (among other pre-release packages) as shown below.

Hopefully that helps someone out there wanting to deep dive into EntityFramework 7.

For as long as I can remember since C# became my language of choice, I've been yearning for the cleanest and most efficient way of getting data from a database (XML, SQL Server, etc.) into my C# application, whether in ASP.NET, WinForms, or a Windows Service. For the most part I was content with Typed DataSets back in the .NET 2.0 days: creating an XML file by hand with all of the different properties, running the command-line xsd tool on it, and having it generate a C# class I could use in my WinForms application. This ran into problems later on when Portable Class Libraries (PCLs) became available. PCLs eliminate code duplication, but to this day they lack Typed DataSet support, not to mention that client<->web service interactions have changed greatly since then, switching to a mostly JSON REST infrastructure.

Ideally, I wish there were a clean way to define a class inside a Portable Class Library (made available to Android, iOS, Windows Store, Windows Phone, and .NET 4.5+) and have Entity Framework map to those entities. Does this exist today? To the best of my research it does not, at least not without a lot of work upfront (more on this later). So where does this leave us today?

For most projects you'll probably see a simple Entity Framework model mapped to SQL tables, views, and possibly stored procedures living inside the ASP.NET WebForms or MVC solution. While this might have been acceptable 4 or 5 years ago, in the multi-platform world we live in today you can't assume you'll only ever have a web client. As I've stated numerous times over the years, investing a little more time in the initial development of a project to plan ahead for multiple platforms is key today. Some of you might be saying, "Well, I know it'll only be a web project; the client said they'd never want a native mobile app when we asked them." Now ask yourself how many times a client came back and asked for exactly what they said they never wanted. Planning ahead not only saves time later, but delivers a better product for your client (internal or external), bringing more value to your service.

Going back to the problem at hand: an Entity Framework model highly coupled to the rest of the ASP.NET application. Below are some possible solutions (not all of them, but in my opinion the most common):

1. Easy way out

Some projects I have worked on had the Entity Framework model (and associated code) in its own library, with the ASP.NET (WebForms, MVC, WebAPI, or WCF service) project simply referencing the library. This is better in that if you migrate away from the existing project, or want a completely different project to reference the same model (a Windows Service, perhaps), you don't have to invest the time in moving all of the code and updating all of the namespaces in both the new library and the project(s) referencing it. However, you still have tight coupling between your project and the Entity Framework model.

2. POCO via Generators

Another possible solution is the POCO (Plain Old CLR Object) approach with Entity Framework. There are a number of generators (Entity Framework Power Tools or the EntityFramework Reverse POCO Generator), but both introduce Entity Framework dependencies into the clients that reference the POCO classes, negating the idea that you'd be able to have one set of classes shared between your platform's clients and Entity Framework.

3. POCO with Reflection

Yet another possible solution is to create a custom attribute and, via reflection, map a class object defined in your PCL. This approach has the cleanliness of a POCO class with custom attributes like the following:
[POCOClass]
[DataContract]
public class UserListingResponseItem {
    [DataMember] [POCOMember("ID")] public int UserID { get; set; }
    [DataMember] [POCOMember("Username")] public string Username { get; set; }
    [DataMember] [POCOMember("FirstName")] public string FirstName { get; set; }
    [DataMember] [POCOMember("LastName")] public string LastName { get; set; }
}
The problem with this solution, as any seasoned C# developer knows, is that reflection is extremely slow. If performance weren't an issue (very unlikely), this could be a viable solution.
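To make the idea concrete, here is a minimal sketch of the kind of reflection-based mapper implied above. The POCOMember attribute and the dictionary standing in for a database row are my own assumptions for illustration, not part of any real library:

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

// Hypothetical attribute tying a C# property to a database column name
[AttributeUsage(AttributeTargets.Property)]
public class POCOMemberAttribute : Attribute {
    public string ColumnName { get; private set; }
    public POCOMemberAttribute(string columnName) { ColumnName = columnName; }
}

public static class ReflectionMapper {
    // Maps a row (column name -> value) onto a new T using the attributes
    public static T Map<T>(IDictionary<string, object> row) where T : new() {
        var item = new T();
        foreach (var prop in typeof(T).GetProperties()) {
            var attr = prop.GetCustomAttribute<POCOMemberAttribute>();
            if (attr != null && row.ContainsKey(attr.ColumnName)) {
                // One reflection call per property, per row - this is the slow part
                prop.SetValue(item, row[attr.ColumnName]);
            }
        }
        return item;
    }
}
```

Every row pays for the attribute lookup and the PropertyInfo.SetValue calls, which is why this approach falls over at scale unless the mapping is cached or compiled ahead of time.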

4. DbContext to the rescue

While doing some research on POCOs with Entity Framework, I came across one approach in which you can leave your existing model untouched, but define a new class inheriting from DbContext like so:
public class TestModelPocoEntities : DbContext {
    public DbSet<UserListingResponseItem> Users { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder) {
        // Configure Code First to ignore the PluralizingTableName convention
        // If you keep this convention the generated tables will have pluralized names.
        modelBuilder.Conventions.Remove<PluralizingTableNameConvention>();
        modelBuilder.Entity<UserListingResponseItem>().ToTable("Users");
        modelBuilder.Entity<UserListingResponseItem>().Property(t => t.UserID).HasColumnName("ID");
        modelBuilder.Entity<UserListingResponseItem>().HasKey(t => t.UserID);
    }
}
What this code block does is map the Users table to a POCO class called UserListingResponseItem (the same definition as above). Having done so, you can then write the following in your code:
using (var entity = new TestModelPocoEntities()) {
    return entity.Users.ToList();
}
As one can see, this is extremely clean on the implementation side, albeit a bit tedious on the backend side. Imagining a recent project at work with hundreds of tables, this could be extremely daunting to maintain, let alone implement in a sizeable existing project.
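One way to keep that backend mapping from sprawling inside a single OnModelCreating as the table count grows (not something covered above, just a common EF6 pattern) is to split each table's mapping into its own EntityTypeConfiguration class and register them all at once:

```csharp
using System.Data.Entity;
using System.Data.Entity.ModelConfiguration;

// One mapping class per table keeps each table's configuration self-contained
public class UserListingResponseItemMap : EntityTypeConfiguration<UserListingResponseItem> {
    public UserListingResponseItemMap() {
        ToTable("Users");
        HasKey(t => t.UserID);
        Property(t => t.UserID).HasColumnName("ID");
    }
}

public class TestModelPocoEntities : DbContext {
    public DbSet<UserListingResponseItem> Users { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder) {
        // Registers every EntityTypeConfiguration found in this assembly in one line
        modelBuilder.Configurations.AddFromAssembly(typeof(TestModelPocoEntities).Assembly);
    }
}
```

It's still the same amount of mapping code overall, but each table lives in its own file, which makes a hundreds-of-tables project far easier to navigate and maintain.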

Unsatisfied with these options, I was curious how a traditional approach would compare performance-wise to option 4 above, given that it satisfies the requirement of a single class residing in a PCL. For comparison, assume the table is defined as such:

Data Translation User Table

The "traditional" approach:
using (var entity = new Entities.testEntities()) {
    var results = entity.Users.ToList();
    return results.Select(a => new UserListingResponseItem {
        FirstName = a.FirstName,
        LastName = a.LastName,
        Username = a.Username,
        UserID = a.ID
    }).ToList();
}
This returns a List of Users Entity Framework objects and then iterates over every item, setting the equivalent property on a UserListingResponseItem instance before returning the result.

The Benchmark

For the benchmark I started with the MVC base template in Visual Studio 2015 Preview, removed all the extra Views, Controllers, and Models, and implemented a basic UI for testing:

Data Translation Base UI

A simple population of random data for the Users table and deletion of records before each test run:
private void createTestData(int numUsersToCreate) {
    using (var entity = new Entities.testEntities()) {
        entity.Database.ExecuteSqlCommand("DELETE FROM dbo.Users");

        for (var x = 0; x < numUsersToCreate; x++) {
            var user = entity.Users.Create();
            user.Active = true;
            user.Modified = DateTimeOffset.Now;
            user.Password = Guid.Empty;
            user.LastName = (x % 2 == 0 ? (x * x).ToString() : x.ToString());
            user.FirstName = (x % 2 != 0 ? (x * x).ToString() : x.ToString());
            user.Username = x.ToString();
            entity.Users.Add(user);
            entity.SaveChanges();
        }
    }
}

Benchmark Results

Below are the results of running the test 3 times for each size data set. For those interested, I ran the benchmark on my AMD FX-8350 (8x4GHz), VS 2013 Update 4, and SQL Server 2014, with the database installed on a Samsung 840 Pro SSD.

Data Translation Performance Results

The results weren't too surprising to me; I figured the "traditional" approach would be a factor or so slower than the DbContext approach, but I hadn't considered how much slower it would be on larger data sets. Granted, we're talking fractions of a second, but multiply that by hundreds of thousands (or millions) of concurrent connections and it is considerable.

Closing thoughts

Having spent a couple of hours deep diving into the newer features of Entity Framework 6.x hoping the golden solution would exist today, I'm having to go back to an idea I had several months ago: jcENTITYFRAMEWORK, in which the associations mapping existing classes to the equivalent tables, views, and stored procedures would be created at compile time, utilizing lower-level ADO.NET calls instead of simply making Entity Framework calls. Where I left off, I was still hitting an ASP.NET performance hit on smaller data sets (though on larger data sets my implementation was several factors better). More to come on that project in the coming weeks/months, as database I/O with C# is definitely not going away for anyone, and there is clearly a problem with the possible solutions today. At the very least, coming up with a clean and portable way to map existing POCOs to SQL tables and stored procedures is a new priority for me.

For those interested in the ASP.NET MVC and PCL code used in benchmarking the two approaches, you can download it here. It's not a definitive test, but it's real-world enough. If for some reason I missed a possible approach, please comment below; I am very eager to see a solution.
A little less than 2 months ago I had some crazy ideas for interacting with a SQL database from C#, as opposed to simply using ADO.NET or Microsoft's own Entity Framework. Not sure exactly how I was going to implement some of the features, I shelved it until I came up with a clean way to implement them.

With this project I had four goals:
1. Same or similar syntax to Entity Framework - meaning I should be able to simply drop in my framework in place of Entity Framework with little to no changes.
2. Performance should be equal to or better in both console and WebAPI applications - covering both desktop applications and what is normal today: WebAPI services executing SQL server-side and returning the results to a client.
3. Implement my own caching syntax that puts the effort of caching on the Framework, not the user of the Framework.
4. Provide an easy way to generate strongly typed classes akin to Microsoft's Entity Framework.

This weekend I was able to achieve #1 and, to some degree, #2.

I was able to achieve an identical syntax to Entity Framework, as in the snippet below:
using (var jFactory = new jcEntityFactory()) {
    jFactory.JCEF_ExecuteTestSP();
}
In regards to performance, I wrote two tests: one that simply called a stored procedure with a single insert statement, and another that returned several thousand rows. To get somewhat realistic results, I directly referenced the framework in a console application, and then wrote a WebAPI service referencing the framework along with a wrapper function to call the WebAPI service from a console application.

Without further ado, here are the results running it with 10 to 1000 iterations:
[bash]
Console App Tests
JC EF 10 Iterations with average of 0.00530009
MS EF 10 Iterations with average of 0.05189771
WebAPI Tests
JC EF 10 Iterations with average of 0.18459302
MS EF 10 Iterations with average of 0.12075582

Console App Tests
JC EF 100 Iterations with average of 0.000740188
MS EF 100 Iterations with average of 0.005783375
WebAPI Tests
JC EF 100 Iterations with average of 0.018184102
MS EF 100 Iterations with average of 0.011673686

Console App Tests
JC EF 1000 Iterations with average of 0.0002790646
MS EF 1000 Iterations with average of 0.001455153
WebAPI Tests
JC EF 1000 Iterations with average of 0.0017801566
MS EF 1000 Iterations with average of 0.0011440657
[/bash]
An interesting note is the difference in WebAPI performance compared to the console application. Sadly, through a WebAPI service my framework is nearly twice as slow, but in console applications (and presumably WinForms and WPF as well) my framework was considerably faster.

So where does that leave the future of the framework? First off, I am going to investigate the performance discrepancies between the two approaches further. Secondly, I am going to add in caching support with the following syntax (assuming one would want to cache a query result for 3 hours):
using (var jFactory = new jcEntityFactory()) {
    jFactory.Cache(JCEF_ExecuteTestSP(), HOURS, 3);
}
More to come as the framework progresses over the next few weeks. As far as a release schedule goes, once all four of my main project requirements are complete I will release a pre-release version on NuGet. I don't plan on open-sourcing the framework, though that may change further down the road. One thing is for sure: it will be freely available through NuGet.