latest posts

As some may have noticed, my site has undergone a slight revamp. Since going live with my ASP.NET MVC 4 based blog in April 2013, making the site responsive from mobile phone to desktop has been one of my top priorities whenever I felt the desire to do web development outside of work. A project on the horizon at work a few months from now demanded I invest off-hours time to get 100% comfortable with the latest techniques. Along with the responsive design, I redid the routing to take advantage of MVC 5's attribute-based routing and did away with the WCF Service that had been the backbone of my blog for nearly 2 years. Now that MVC and WebAPI services can be hosted in one solution (ideal for my blog, though I'm still not convinced that's a good separation of concerns for larger enterprise projects), there is no reason for my blog to be split across 2 separate solutions.

Along those same lines, my recent presentation at the Blue Ocean Competition on Xamarin Forms and C# in general made me turn over a new leaf in regards to my side projects. Starting a week or so ago, every day I've been checking in the complete source code of a project from the private Subversion repository I've kept for 8 years now. My thought process: if even 1 person finds 1 thing they didn't know or could use, that's 1 more person who got use out of it than if it simply sat in my SVN repository until I came back around to work on it. As anyone who has followed my blog for any period of time knows, I pick up a project, work on it for a while and then pick it back up a few months (or years) later.

That being said, in the coming weeks the platform that powers my blog, bbXP, will also be made available on my GitHub account. There's some work involved to get it to a more generic place, along with cleaning up some of the code now that I've got another 2 years of MVC development under my belt.

Lastly, content-wise, I finally cleaned up my White Papers so everything is formatted properly now. I also began to fill in the gaps on the About Me page; there are still a lot of gaps in my development history that I want to document, if only for myself.


After long sitting on my "todo list" as a Saturday deep dive, ASP.NET SignalR finally got some attention yesterday. For those unaware, SignalR offers bi-directional communication between clients and servers over WebSockets. Longtime readers of my blog may recall I did a deep dive into WebSockets back in August 2012 with a WCF Service and Console App, but lost in the mix of several other technologies (MVC, WebAPI, OpenCL etc.) I had forgotten how powerful the technology was. Before I go any further, I highly suggest reading the Introduction to SignalR by Patrick Fletcher.

Fast forward to January 2015: things are even more connected, with REST web services like WebAPI, the Internet of Things and mobile apps on Android, iOS and Windows Phone exploding over the last couple of years. A specific need for real-time communication from server to client came up last Friday night for a dashboard in the next big revision of the architecture at work. The idea behind it is to show that every request is truly being tracked: who made it, when, and what they were requesting or submitting. I hadn't implemented this type of audit trail in any of the previous three major service-oriented architectures. In addition, presenting the data with Telerik's Kendo UI Data Visualization controls would be a nice way to show the audit trail functionality not only in a grid listing but also in graphs (pictures still tell a thousand words).

As I dove in, the only examples/tutorials I found showed a simple chat: a user enters his or her name and messages, and without any postback an unordered list dynamically updates as new messages come in. Pretty neat, but what I was curious about was how one would execute a server-side trigger to all the clients. Going back to my idea for enhancing my work project, it would need to be triggered by the WebAPI service and passed to the SignalR MVC app, which in turn the main MVC app would act as a client of, displaying anything triggered originally from the WebAPI service. So I started diving further into SignalR, and in this post I go over what I did (if there is a better way, please let me know in the comments). In the coming weeks I will do a follow-up post as I expand the functionality to show, at a basic level, having three separate projects like the setup I will eventually implement at work.

MVC SignalR Server Side Trigger Example

The following code/screenshots all tie back to an example I wrote for this post; you can download it here.

To begin I started with the base Visual Studio 2013 MVC Project (I will assume from here on out everyone is familiar with ASP.NET MVC):
ASP.NET Web Application Visual Studio 2013 Template
Then select the MVC Template:
ASP.NET MVC Template

Add the NuGet package for SignalR (be sure to get the full package as shown in the screenshot, not just the client):
NuGet Manager with SignalR

Upon the NuGet Package completing installation, you will need to add an OWIN Startup File as shown below:
OWIN Startup Class - Visual Studio 2013

This is crucial to SignalR working properly. For posterity here is the Startup.cs in the project I mentioned above:
using Microsoft.AspNet.SignalR;
using Microsoft.Owin;
using Owin;

[assembly: OwinStartup(typeof(SignalRMVC.Startup))]

namespace SignalRMVC {
    public class Startup {
        public void Configuration(IAppBuilder app) {
            var config = new HubConfiguration {
                EnableJavaScriptProxies = true
            };

            app.MapSignalR(config);
        }
    }
}
Also new for MVC developers is the idea of a SignalR Hub. You will need to add at least one SignalR Hub class to your project: go to Add and then New Item, scroll down to the SignalR grouping and select the SignalR Hub Class (v2) option as shown in the screenshot below:
SignalR Hub Class (v2) - Visual Studio 2013

In the Hub class you define the endpoint(s) for your SignalR Server/Client relationship. For this example, I wrote a simple SendMessage function that accepts a string parameter like so:
using Microsoft.AspNet.SignalR;
using Microsoft.AspNet.SignalR.Hubs;

namespace SignalRMVC {
    [HubName("systemStatusHub")]
    public class SystemStatusHub : Hub {
        internal static void SendMessage(string logEntry) {
            var context = GlobalHost.ConnectionManager.GetHubContext<SystemStatusHub>();

            context.Clients.All.sendData(logEntry);
        }
    }
}
To make things a little cleaner for this example, I added a BaseController with a wrapper around the SignalR Hub (mentioned above), adding a timestamp along with the string passed from the MVC Action like so:
using System;
using System.Web.Mvc;

namespace SignalRMVC.Controllers {
    public class BaseController : Controller {
        internal void RecordVisit(string actionName) {
            SystemStatusHub.SendMessage(String.Format("Someone checked out the {0} page at {1}", actionName, DateTime.Now));
        }
    }
}
With the static wrapper function in place, let's look at the actual MVC Controller, HomeController:
using System.Web.Mvc;

namespace SignalRMVC.Controllers {
    public class HomeController : BaseController {
        public ActionResult Index() {
            RecordVisit("home");

            return View();
        }

        public ActionResult About() {
            RecordVisit("about");

            return View();
        }

        public ActionResult Contact() {
            RecordVisit("contact");

            return View();
        }
    }
}
Nothing unusual for a C# developer, simply passing an indicator based on the title of the Action.

And then the Index.cshtml contains the reference to the dynamically generated /signalr/hubs JavaScript file, the JavaScript connection to the hub, and the handler for what should happen when it receives a message:

Site Activity

    Pretty simple: as the messages come in, append an li to the activityLog ul.

    Finished Product

    Below is a screenshot after clicking around from another browser:

    SignalR in Action

    Again, if you wish to download the complete example you can do so here. In the coming weeks expect at least one more SignalR post detailing a possible solution for common Service Oriented Architectures (separate MVC, WebAPI and SignalR hosted apps). I hope this helps someone out at the beginning of their SignalR journey.

    While doing some routine maintenance on this blog, I updated the usual packages (JSON.NET, Entity Framework etc.). When testing locally afterwards, I came across the following error:
    ASP.NET Webpages Conflict

    In looking at the Web.config, the NuGet Package did update the dependentAssembly section properly:
    ASP.NET Webpages Conflict

    However, in the appSettings section, it didn't update the webpages:Version value:
    ASP.NET Webpages Conflict

    Simply update the webpages:Version value in appSettings to match the newly installed version and you'll be good to go again.


    For the longest time I’ve had these great ideas, only to keep them in my head and then watch someone else or some company turn around and develop the idea (not to say anyone stole it; given the billions of people on this planet, it is only natural to assume one of them would come up with the same idea). Having watched this happen, as I am sure other developers have since the 70s, I’ve decided to put my outlook on things here, once a year, every July.

    As anyone who reads or has read my blog for a decent amount of time knows, I am very much a polyglot of software and enjoy the system building/configuration/maintenance aspect of hardware. For me, they go hand in hand. The more I know about the platform itself (single-threaded versus multi-threaded performance, disk IOPS etc.), the better I can program the software I develop. Likewise, the more I know about a specific programming model, the better I will know the hardware it is specialized for. Taken a step further, this makes implementation decisions at work and in my own projects better.

    As mentioned in the About Me section, I started out in QBasic and a little later, when I was 12, really got into custom PC building (which wasn’t anywhere near as big as it is today): digging through the massive Computer Shopper magazines, drooling over the prospect of the highest-end Pentium MMX CPUs, massive (at the time) 8 GB hard drives and 19” monitors. Along with that came the less glamorous 90s PC issues of IRQ conflicts, pass-through 3Dfx Voodoo cards that required a 2D video card (and yet another PCI slot), SCSI PCI controllers and dedicated DVD decoders. Suffice it to say I am glad I experienced all of that, as it creates a huge appreciation for USB, PCI Express, SATA and, if nothing else, the stability of running a machine 24/7 under a heavy workload (yes, part of that is also software).

    To return to the blog’s title…

    Thoughts on the Internet of Things?

    Universally I do follow the Internet of Things (IoT) mindset. Everything will be interconnected, which raises the question of privacy and what that means for the developer of the hardware, the developer of the software and the consumer. As we all know, your data is money. If the lights in your house, for instance, were WiFi-enabled and connected to a centralized server in your house with an exposed client on a tablet or phone, I would be willing to bet the hardware and software developers would love to know the total energy usage, which lights in which rooms were on, what type of bulbs were installed and when the bulbs were dying. Marketing data could then be sold to let you know of bundle deals, new “more efficient” bulbs, or offers based on how much time is spent in which rooms (if you are in the home theater room a lot, sell the consumer on Blu-rays and snacks, for instance). With each component of your home becoming this way, more and more data will be captured, and in some cases it will be possible to predict what you want before you realize it, simply based off your tendencies.

    While I don’t like the lack of privacy in that model (hopefully some laws can be enacted to resolve those issues), as a software developer I would hate to ever be associated with the backlash of capturing that data. Still, this idea of everything being connected will create a whole new programming model. With the recent trend towards REST web services returning gzipped JSON, with WebAPI for instance, the problem of submitting and retrieving data has never been easier or more portable across so many platforms. With C# in particular, in conjunction with the HttpClient library available on NuGet, a lot of the grunt work is already done for you in an asynchronous manner. Where I do see a change is in the standardization of an API for your lights, TV, garage door, toaster etc., allowing 3rd party plugins and universal clients to be created rather than having a different app to control each element, or one company providing a proprietary API that only works on its devices, forcing the consumer into the difficult decision of either staying with that provider for consistency or mixing the two and requiring 2 apps/devices.
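    To illustrate how little grunt work is left, here is a minimal sketch of sending a command to a hypothetical standardized light API. Everything specific here is invented for illustration: the endpoint URL, the command shape and its property names. I use System.Text.Json purely to keep the sketch dependency-free; in the post's era JSON.NET would be the more likely choice.

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

// Hypothetical command for a standardized light API.
var command = new { Room = "HomeTheater", LightId = 3, On = false };

// Serialize to JSON; any client on any platform could produce this.
string payload = JsonSerializer.Serialize(command);
Console.WriteLine(payload);

// Posting it is only a few lines with HttpClient (not executed here,
// since the endpoint is fictional):
async Task SendAsync(string json) {
    using var client = new HttpClient();
    var content = new StringContent(json, Encoding.UTF8, "application/json");
    var response = await client.PostAsync("http://lights.local/api/lights", content);
    response.EnsureSuccessStatusCode();
}
```

    The point is not the specific API, but that once the transport and serialization are commodity pieces, the remaining work is agreeing on the command schema across vendors.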

    Where do I see mobile technology going?

    Much like where mobile devices have headed (as I predicted 2 years ago), apps are becoming ever more integrated into your device (for better or for worse). I don’t see this trend changing, but I do hope that from a privacy standpoint apps have to become more explicit about what they are accessing. I know there is a fine line for the big three (Apple, Google and Microsoft) before becoming overly explicit about every action (remember Vista?), but I think if an app accesses more than your current location, its capabilities should be shown in a bolder or larger font to better convey the app's true access to your device. I don’t see this situation getting better from a privacy standpoint, but I do see more and more customer demand for the “native” experience to be like that of Cortana on Windows Phone 8.1: she has access to the data you provide her and will help make your experience better. As the phones provide more and more APIs, this trend will only continue until apps are more like plugins to your base operating system’s experience, integrating into services like Yelp, Facebook, Twitter etc.

    Where do I see web technology going?

    I enjoyed diving into MVC over the last year and a half. The model definitely feels much more in line with an MVVM XAML project, but it still has an overwhelmingly strong tie to the client side, between the heavy use of jQuery and the level of effort in keeping up with the ever-changing browser space (i.e. browser updates coming out at an alarming rate). While I think we all appreciate it when we go to a site on our phones or desktops and it scales nicely, providing a rich experience no matter the device, I feel the ultimate goal of trying to achieve a native experience in the browser is a waste of effort. I know just about every web developer might stop reading and be in outrage, but what was the goal of the last web site you developed and designed that was also designed for mobile? Was it to convey information to the masses? Or was it simply a stop-gap until you had a native team to develop for the big three mobile platforms?

    In certain circumstances I agree with the stance of making HTML 5 web apps instead of native apps, especially when the cost of a project is prohibitive. But at a certain point, especially as of late with Xamarin’s first-class citizen status with Microsoft, you have to ask yourself: could I deliver a richer experience natively, and possibly faster (especially given the vast range of mobile browsers to contend with in the HTML 5 route)?

    If you’re a C# developer who wants to deliver a native experience, definitely give the combination of MvvmCross, Xamarin’s framework and Portable Class Libraries a try. I wish all of those tools had existed when I first dove into iOS development 4 years ago.

    Where do I see desktop apps going?

    In regards to desktop applications, I don’t see them going away even in the “app store” world we live in now. I do, however, see customers demanding a richer experience after having had a rich native experience on their phones or after using a XAML Windows 8.x Store App. The point being, I don’t think it will be acceptable for an app to look and feel like the default WinForms grey-and-black color scheme that we’ve all used at one point in our careers and more than likely began our programming careers with (thinking back to classic Visual Basic).

    Touch will also play a big factor in desktop applications (even in the enterprise). Recently at work I did a Windows 8.1 Store App for an executive dashboard. I designed the app with touch in mind, and it was interesting how that changes your perspective on interacting with data. The app in question utilized multi-layered graphs and a Bing Map with several layers (heat maps and pushpins). Gone was the unnatural mouse scrolling; in its place were pinching, zooming and rotating, as if one were in a science fiction movie from just 10 years ago.

    I see this trend continuing, especially as practical general-purpose devices like laptops gain touch screens at every price point instead of the premium they previously demanded. All that needs to come about is a killer application for the Windows Store; could your next app be that app?

    Where is programming heading in general?

    Getting programmers out of the single-threaded, top-to-bottom programming mindset. I am hoping that next July, when I do a prediction post, this won’t even be a discussion point, but sadly I don’t see it changing anytime soon. Taking a step back and looking at what this means, generally speaking: programmers aren’t utilizing the hardware available to them to its full potential.

    Over 5 years ago at this point, I found myself at odds with a consultant who kept asking for more and more CPUs to be added to a particular VM. When he first asked, it seemed reasonable, as there was considerably more traffic coming to a particular ASP.NET 3.5 web application as a result of a lot of eagerly awaited functionality he and his team had just deployed. Even after the additional CPUs were added, his solution was still extremely slow under no load. This triggered me to review his Subversion check-ins, and I realized the crux of the matter wasn’t the server; it was his single-threaded, resource-intensive, time-consuming code. In this case, the code was poorly written on top of trying to do a lot of work on a single page. For those who remember .NET 3.5’s implementation of LINQ, it wasn’t exactly a strong performer in performance-intensive applications, let alone when looped through multiple times as opposed to one larger LINQ query. The moral of the story: adding CPUs only helped the single-threaded code handle the increased load, not the performance of a user’s experience in a 0% load session.
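    As a contrived sketch of that anti-pattern (the data and queries are invented for illustration, not the consultant's actual code), compare enumerating the same collection several times against accumulating everything in a single pass:

```csharp
using System;
using System.Linq;

var orders = Enumerable.Range(1, 1_000_000)
    .Select(i => new { Id = i, Total = i % 500, Shipped = i % 3 == 0 })
    .ToList();

// Anti-pattern: three full passes over the same million-item list.
var shippedCount = orders.Where(o => o.Shipped).Count();
var bigOrderCount = orders.Where(o => o.Total > 400).Count();
var revenue = orders.Where(o => o.Shipped).Sum(o => o.Total);

// One pass: accumulate all three results in a single loop.
int shipped = 0, big = 0, rev = 0;
foreach (var o in orders) {
    if (o.Shipped) { shipped++; rev += o.Total; }
    if (o.Total > 400) big++;
}

Console.WriteLine($"{shipped} {big} {rev}");
```

    Both versions are single-threaded; the second simply stops paying the enumeration cost three times, which is exactly the kind of fix no amount of extra CPUs will make for you.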

    A few months later, when .NET 4 came out of beta, and further still when the Task Parallel Library was released, my view on performance changed (after all, jcBENCH stemmed from my passion for diving into parallel programming on different architectures and operating systems back in January 2012). No longer was I relying on high single-threaded-performance CPUs, but instead writing my code to take advantage of the ever-increasing number of cores available to me at that particular client (for those curious, 2U 24-core Opteron HP G5 rackmount servers).
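    A minimal Task Parallel Library sketch of that shift (the workload is an invented stand-in): Parallel.For spreads the iterations across the available cores, with a thread-local subtotal merged at the end so the threads don't contend on one shared variable:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

const int N = 1_000_000;
long total = 0;

// localInit creates a per-thread subtotal, the body accumulates into
// it, and localFinally merges each subtotal into 'total' exactly once.
Parallel.For(0, N,
    () => 0L,
    (i, state, subtotal) => subtotal + (i % 7),
    subtotal => Interlocked.Add(ref total, subtotal));

Console.WriteLine(total);
```

    The thread-local overload matters: calling Interlocked.Add on every iteration would serialize the loop on that one memory location and throw away most of the parallel gain.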

    With .NET 4.5’s async/await I was hopeful that more of the developers I worked with would take advantage of this easy model and no longer lock the UI thread, but I was largely disappointed. If developers couldn’t grasp async/await, let alone the TPL, how could they proceed to what I feel is an even bigger breakthrough becoming available to developers: heterogeneous programming, or more specifically OpenCL?
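    For completeness, a minimal async/await sketch (the simulated I/O is invented): the awaiting method yields its thread instead of blocking it, which is precisely what keeps a UI responsive while work is in flight:

```csharp
using System;
using System.Threading.Tasks;

async Task<string> FetchReportAsync() {
    // Stand-in for a network or database call; Task.Delay yields the
    // thread instead of blocking it the way Thread.Sleep would.
    await Task.Delay(100);
    return "report ready";
}

// In a UI app this would be awaited from an event handler so the UI
// thread stays free; in this console sketch we just await it directly.
string result = await FetchReportAsync();
Console.WriteLine(result);
```

    The whole model is two keywords on top of the TPL, which is why it was so frustrating to watch it go unused.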

    With parallel programming comes the need to break your problem down into independent sub-problems that come together at a later time (like breaking image processing down to operate on ranges of pixels rather than the entire image, for instance). This is where heterogeneous programming can make an even bigger impact, in particular on GPUs (Graphics Processing Units), which have upwards of hundreds of cores to process tasks.
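    The pixel-range idea can be sketched on the CPU first: a grayscale conversion over a tiny invented RGB buffer, where every pixel is independent of every other. That independence is exactly the shape of work an OpenCL kernel would take, with each pixel becoming a GPU work item:

```csharp
using System;
using System.Threading.Tasks;

// Invented 4-pixel RGB image, 3 bytes per pixel.
byte[] rgb = { 255, 0, 0,  0, 255, 0,  0, 0, 255,  255, 255, 255 };
int pixels = rgb.Length / 3;
byte[] gray = new byte[pixels];

// No pixel reads another pixel's data, so the range can be split
// across CPU cores here, or across hundreds of GPU cores in OpenCL.
Parallel.For(0, pixels, p => {
    int i = p * 3;
    // Integer luma approximation of the usual 0.299/0.587/0.114 weights.
    gray[p] = (byte)((rgb[i] * 299 + rgb[i + 1] * 587 + rgb[i + 2] * 114) / 1000);
});

Console.WriteLine(string.Join(",", gray));
```

    On a real image the per-pixel work is identical and the data volume is huge, which is the regime where a GPU's core count pays off.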

    I had dabbled in OpenCL as far back as June 2012, working on the OpenCL version of jcBENCH, and I did some further research back in January/February of this year (2014) in preparation for a large project at work, a project where I ended up using the TPL extensively instead. The problem wasn’t OpenCL’s performance, but my mindset at the time. Before the project began, I thought I knew the problem inside and out, but really I only knew it as a human would think about it, not as a machine that only knows 1s and 0s. The problem wasn’t a simple task, nor was it something I had ever attempted previously, so I gave myself some slack when, two months in, it finally hit me what I was really trying to solve: teaching a computer to think like a human. Therefore, when pursuing heterogeneous programming as a possible solution, ensure you have a 100% understanding of the problem and what you are ultimately trying to achieve; in many cases it might then make sense to utilize OpenCL instead of a traditional parallel model like the TPL.

    So why OpenCL, outside of the speed boost? Think about the last laptop or desktop you bought; chances are it has an OpenCL 1.x compatible APU and/or GPU in it (i.e. you aren’t required to spend any more money, just utilize what is already available to you). On the portable side in particular, where laptops/Ultrabooks already have a lower-performing CPU than your desktop, why tie up the CPU when the GPU could offload some of that work?

    The only big problem with OpenCL for C# programmers is the lack of an officially supported interop library from AMD, Apple or any of the other members of the OpenCL group. Instead you’re at the mercy of one of the freely available wrapper libraries like OpenCL.NET, or you can simply write your own wrapper. I haven’t made up my mind yet as to which path I will go down, but I know at some point a middleware makes sense. Wouldn’t it be neat to have a generic work item and be able to simply pass it off to your GPU(s) whenever you wanted?
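    That middleware idea could start as small as a delegate shape and a dispatcher. Everything below is hypothetical (names included), with a plain CPU path standing in for the eventual OpenCL-backed one:

```csharp
using System;
using System.Linq;

// Hypothetical "generic work item": a function from an input array to
// an output array, the same shape a GPU kernel consumes and produces.
Func<int[], long[]> squareWorkItem = input =>
    input.Select(x => (long)x * x).ToArray();

// A real middleware would inspect the hardware and hand the item to an
// OpenCL kernel when a capable GPU is present; this stand-in dispatcher
// always runs on the CPU.
long[] RunOnBestDevice(Func<int[], long[]> workItem, int[] input) => workItem(input);

long[] result = RunOnBestDevice(squareWorkItem, new[] { 1, 2, 3, 4 });
Console.WriteLine(string.Join(",", result));
```

    The appeal of the abstraction is that calling code never changes when the dispatcher later learns to route work to a GPU.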

    As far as where to begin with OpenCL in general, I strongly suggest reading the OpenCL Programming Guide. Those who have done OpenGL and are familiar with the “Red Book”, this book follows a similar pattern with a similar expectation and end result.


    Could I be way off? Sure; it’s hard to predict the future while being grounded in the past that brought us here, meaning it’s hard to let go of how we as programmers and technologists have evolved over the last 5 years to satisfy not only current consumer demand but our own, and to anticipate what is next. What I am more curious about is hearing from programmers outside of the CLR, in particular the C++, Java and Python crowds: where do they feel the industry is heading, and how do they see their programming languages handling the future? Please leave comments.
    Last night I presented at CMAP's main meeting on using SQL Server's Geospatial Data functionality with MVC and Windows Phone, with a focus on getting exposed to the data types and the associated SQLCLR functions.

    As promised, you can download the full Visual Studio 2013 solution (PCL, MVC/WebAPI application and the Windows Phone 8.1 application), in addition to the SQL script to run on your database and the Powerpoint presentation itself here.

    Included in the zip file as well is a Readme.txt with the instructions for what you would need to update before running the solution:
    -Update the Database Connection String in the Web.Config
    -Update the WebAPI Service's connection string in the PCL
    -Update the Bing Maps API Key in the Index/View of the MVC App

    Any questions, concerns or comments, leave them below or email me at jarred at jarredcapellman dot com. As stated during the presentation, this is not a production-ready solution (there is no caching, error handling etc.), but merely a demo for those who want to see how to begin using Spatial Data in their applications.
    After wrapping up Phase One of my migration from WordPress to MVC4, I began diving into the admin side of the migration, trying to replicate a lot of the ease of use WordPress offered while adding my own touches. To begin, I started with the Add/Edit Post form.

    After adding in my view:
    @model bbxp.mvc.Models.PostModel

    @{
        Layout = "~/Views/Shared/_AdminLayout.cshtml";
    }

    @using (Html.BeginForm("SavePost", "bbxpAdmin", FormMethod.Post)) {
        if (Model.PostID.HasValue) {
        }
    }
    And then my code behind in the controller:
    public ActionResult AddEditPost(int? PostID) {
        if (!checkAuth()) {
            var lModel = new Models.LoginModel();

            lModel.ErrorMessage = "Authentication failed...";

            return View("Index", lModel);
        }

        var pModel = new Models.PostModel();

        if (PostID.HasValue) {
            using (var pFactory = new PostFactory()) {
                var post = pFactory.GetPost(PostID.Value);

                pModel.Body = post.Body;
                pModel.PostID = post.ID;
                pModel.Title = post.Title;
                pModel.Tags = string.Join(", ", post.Tags.Select(a => a.Name).ToList());
                pModel.Categories = String.Empty;
            }
        }

        return View(pModel);
    }
    My postback ActionResult in my Controller never got hit. After inspecting the outputted HTML, I noticed the form's action was empty:

    <form action="" method="post">
    Having a hunch it was a result of a bad route, I checked my Global.asax.cs file and added a specific route to handle the Action/Controller:
    routes.MapRoute(name: "bbxpAddEditPost", url: "bbxpadmin/{action}/{PostID}", defaults: new { controller = "bbxpAdmin", action = "AddEditPost"});
    Sure enough, immediately after adding the route, the form posted back properly and I was back at work adding additional functionality to the backend. Hopefully this helps someone else out, as I only found one unanswered StackOverflow post on this issue. I should also note a handy feature when utilizing Output Caching, as discussed in a previous post: programmatically resetting the cache.

    In my case I added the following in my SavePost ActionResult:
    Response.RemoveOutputCacheItem(Url.Action("Index", "Home"));
    This removes the cached copy of my main Post Listing.
    In today's post I will be diving into adding Search Functionality, Custom Error Pages and MVC Optimizations. Links to previous parts: Part 1, Part 2, Part 3, Part 4, Part 5, Part 6 and Part 7.

    Search Functionality

    A few common approaches to adding search functionality to a Web Application:

    Web App Search Approaches

    1. Pull down all of the data and then search on it using a for loop or LINQ - An approach I loathe because to me this is a waste of resources, especially if the content base you're pulling from is of a considerable amount. Just ask yourself, if you were at a library and you knew the topic you were looking for, would you pull out all of the books in the entire library and then filter down or simply find the topic's section and get the handful of books?
    2. Implement a Stored Procedure with a query argument and return the results - An approach I have used over the years, it is easy to implement and for me it leaves the querying where it should be - in the database.
    3. Creating a Search Class with a dynamic interface and customizable properties to search and a Stored Procedure backend like in Approach 2 - An approach I will be going down at a later date for site wide search of a very large/complex WebForms app.
    For the scope of this project I am going with Option #2, since the content I am searching only spans the Posts objects. At a later date in Phase 2 I will probably expand this to fit Option #3, since I will want to be able to search on various objects and return them all in a meaningful way, quickly and efficiently. So let's dive into Option #2.

    Because virtually the same block of SQL is utilized in many Stored Procedures at this point, I created a SQL View:

    [sql]
    CREATE VIEW dbo.ActivePosts
    AS
    SELECT dbo.Posts.ID,
           dbo.Posts.Created,
           dbo.Posts.Title,
           dbo.Posts.Body,
           dbo.Users.Username,
           dbo.Posts.URLSafename,
           dbo.getTagsByPostFUNC(dbo.Posts.ID) AS 'TagList',
           dbo.getSafeTagsByPostFUNC(dbo.Posts.ID) AS 'SafeTagList',
           (SELECT COUNT(*) FROM dbo.PostComments WHERE dbo.PostComments.PostID = dbo.Posts.ID AND dbo.PostComments.Active = 1) AS 'NumComments'
    FROM dbo.Posts
    INNER JOIN dbo.Users ON dbo.Users.ID = dbo.Posts.PostedByUserID
    WHERE dbo.Posts.Active = 1
    [/sql]

    And then created a new Stored Procedure with the ability to search content, referencing the new SQL View:

    [sql]
    CREATE PROCEDURE [dbo].[getSearchPostListingSP] (@searchQueryString VARCHAR(MAX))
    AS
    SELECT dbo.ActivePosts.*
    FROM dbo.ActivePosts
    WHERE (dbo.ActivePosts.Title LIKE '%' + @searchQueryString + '%'
       OR dbo.ActivePosts.Body LIKE '%' + @searchQueryString + '%')
    ORDER BY dbo.ActivePosts.Created DESC
    [/sql]

    You may be asking why not simply add the ActivePosts SQL View to your Entity Model and do something like this in your C# code:
    public List<ActivePosts> GetSearchPostResults(string searchQueryString) {
        using (var eFactory = new bbxp_jarredcapellmanEntities()) {
            return eFactory.ActivePosts.Where(a => a.Title.Contains(searchQueryString) || a.Body.Contains(searchQueryString)).ToList();
        }
    }
    That's perfectly valid and I am not against doing it that way, but I feel that code like that belongs at the database level, thus the Stored Procedure. Granted, Stored Procedures do add a level of maintenance over doing it via code: any time you update/add/remove columns, you have to update the Complex Type in your Entity Model inside Visual Studio and then update the C# code that references that Stored Procedure. For me it is worth it, but to each their own. I have not made performance comparisons for this particular scenario; however, last summer I did some aggregate performance comparisons between LINQ, PLINQ and Stored Procedures in my LINQ vs PLINQ vs Stored Procedure Row Count Performance in C# post. You can't do a 1-to-1 comparison between varchar column searching and aggregate function performance, but my point, or better put, the lesson I want to convey, is to keep an open mind and explore all possible routes. You never want to find yourself stagnating in your software development career, doing something simply because you know it works. Things change almost daily, it seems; it's near impossible as a polyglot programmer to keep up with every change, but when a new project comes around at work, do your homework even if it means sacrificing your nights and weekends. The benefits become apparent instantly, and for me the most rewarding aspect is knowing that when you laid down that first character in your code, you did so with the knowledge that what you were doing was the best you could provide to your employer and/or clients.

    Back to implementing the search functionality, I added the following function to my PostFactory class:
    public List<Objects.Post> GetSearchResults(string searchQueryString) {
        using (var eFactory = new bbxp_jarredcapellmanEntities()) {
            return eFactory.getSearchPostListingSP(searchQueryString).Select(a => new Objects.Post(a.ID, a.Created, a.Title, a.Body, a.TagList, a.SafeTagList, a.NumComments.Value, a.URLSafename)).ToList();
        }
    }
    You might see the similarity to other functions if you've been following this series. The function is exposed as an Operation Contract inside the WCF Service:
    [OperationContract]
    List<lib.Objects.Post> GetPostSearchResults(string searchQueryString);

    public List<Post> GetPostSearchResults(string searchQueryString) {
        using (var pFactory = new PostFactory()) {
            return pFactory.GetSearchResults(searchQueryString);
        }
    }
    Back in the MVC App I created a new route to handle searching:
    routes.MapRoute("Search", "Search/{searchQueryString}", new { controller = "Home", action = "Search" });
    So now I can enter values via the URL; for instance, /Search/mvc would search all Posts that contained mvc in the title or body. Then in my Controller class:
    [AcceptVerbs(HttpVerbs.Post)]
    public ActionResult Search(string searchQueryString)
    {
        ViewBag.Title = searchQueryString + " << Search Results << " + Common.Constants.SITE_NAME;

        var model = new Models.HomeModel(baseModel);

        using (var ws = new WCFServiceClient())
        {
            model.Posts = ws.GetPostSearchResults(searchQueryString);
        }

        ViewBag.Model = model;

        return View("Index", model);
    }
    In my partial view:
    <div class="Widget">
        <div class="Title">
            <h3>Search Post History</h3>
        </div>
        <div class="Content">
            @using (Html.BeginForm("Search", "Home", new { searchQueryString = "searchQueryString" }, FormMethod.Post)) {
                <input type="text" id="searchQueryString" name="searchQueryString" class="k-textbox" required placeholder="enter query here" />
                <button class="k-button" type="submit">Search >></button>
            }
        </div>
    </div>
    When all was done: [caption id="attachment_2078" align="aligncenter" width="252"]Search box in MVC App[/caption] Now you might be asking: what if there are no results? You get an empty view: [caption id="attachment_2079" align="aligncenter" width="300"]Empty Result - the wrong way to handle it[/caption] This leads me to my next topic:

    Custom Error Pages

    We have all been on sites where we go someplace we either don't have access to, that doesn't exist anymore or that we misspelled. WordPress had a fairly good handler for this scenario: [caption id="attachment_2081" align="aligncenter" width="300"]WordPress Content not found Handler[/caption] As seen above, when no results are found we want to let the user know, but also create a generic handler for other error events. To get started, let's add a Route to the Global.asax.cs:
    routes.MapRoute("Error", "Error", new { controller = "Error", action = "Index" });
    This will map /Error to an ErrorController and a Views/Error/Index.cshtml. And my ErrorController:
    public class ErrorController : BaseController
    {
        public ActionResult Index()
        {
            var model = new Models.ErrorModel(baseModel);

            return View(model);
        }
    }
    And my View:
    @model bbxp.mvc.Models.ErrorModel

    <div class="errorPage">
        <h2>Not Found</h2>
        <div class="content">
            Sorry, but you are looking for something that isn't here.
        </div>
    </div>
    Now you may be asking: why isn't the actual error passed into the Controller to be displayed? Personally, I feel a generic error message for the end user, combined with logging/reporting the errors to the administrators and maintainers of the site, is the best approach. In addition, a generic message protects you somewhat from exposing sensitive information to a potential hacker, such as "No users match the query" or, worse, database connection information. That being said, I added a wrapper in my BaseController:
    public ActionResult ThrowError(string exceptionString)
    {
        // TODO: Log errors either to the database or email the powers that be

        return RedirectToAction("Index", "Error");
    }
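    As a placeholder for that TODO, a minimal file-based logger could look like the sketch below. The log path and line format are my assumptions, not part of the original design, which will eventually log to the database and send email alerts.

    ```csharp
    // Hypothetical stopgap for the TODO: append each error to a flat file in
    // App_Data before redirecting (swap in database/email logging later).
    public ActionResult ThrowError(string exceptionString)
    {
        var line = String.Format("{0:u}\t{1}", DateTime.Now, exceptionString);

        // App_Data is not served to browsers, so the log stays private
        System.IO.File.AppendAllLines(Server.MapPath("~/App_Data/errors.log"), new[] { line });

        return RedirectToAction("Index", "Error");
    }
    ```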
    Down the road this wrapper will record the error to the database and then email users with alerts turned on. Since I haven't started on the "admin" section, I am leaving it as is for the time being. The reason the argument is there now is so that when that does happen, all of my existing front end code is already good to go as far as logging. Now that I've got my base function implemented, let's revisit the Search function mentioned earlier:
    public ActionResult Search(string searchQueryString)
    {
        ViewBag.Title = searchQueryString + " << Search Results << " + Common.Constants.SITE_NAME;

        var model = new Models.HomeModel(baseModel);

        using (var ws = new WCFServiceClient())
        {
            model.Posts = ws.GetPostSearchResults(searchQueryString);
        }

        if (model.Posts.Count == 0)
        {
            return ThrowError(searchQueryString + " returned 0 results");
        }

        ViewBag.Model = model;

        return View("Index", model);
    }
    Note the if conditional and the call to ThrowError; no other work is necessary. As implemented: [caption id="attachment_2083" align="aligncenter" width="300"]Not Found Error Handler Page in the MVC App[/caption] Where does this leave us? The final phase in development: Optimization.


    You might be wondering why I left optimization for last. I feel as though premature optimization leads not only to a longer debugging period when nailing down initial functionality, but also, if you do things right as you go, your optimizations end up being just tweaking. I've done both approaches in my career and have definitely had more success doing it last. If you've had the opposite experience, please comment below; I would very much like to hear your story. So where do I want to begin?

    YSlow and MVC Bundling

    For me it makes sense to do the more trivial checks that provide the most bang for the buck. A key tool to assist in this manner is YSlow; I personally use the Firefox Add-on version available here. As with any optimization, you need to do a baseline check to give yourself a basis from which to improve. In this case I am going from a full featured PHP based CMS, WordPress, to a custom MVC 4 Web App, so I was very intrigued by the initial results below. [caption id="attachment_2088" align="aligncenter" width="300"]WordPress YSlow Ratings[/caption] [caption id="attachment_2089" align="aligncenter" width="300"]Custom MVC 4 App YSlow Ratings[/caption] Scoring only 1 point less than the battle tested WordPress version, with no optimizations, is pretty neat I feel. Let's now look into what YSlow marked the MVC 4 App down on. In the first line item, it found that the site is using 13 JavaScript files and 8 CSS files. One of the neat MVC features is the ability to bundle multiple CSS and JavaScript files into one. This not only cuts down on the number of HTTP requests, but also speeds up the initial page load, after which most of your content is cached for future page requests. If you recall from an earlier post, our _Layout.cshtml included quite a few CSS and JavaScript files:
    <link href="@Url.Content("~/Content/Site.css")" rel="stylesheet" type="text/css" />
    <link href="@Url.Content("~/Content/kendo/2013.1.319/kendo.common.min.css")" rel="stylesheet" type="text/css" />
    <link href="@Url.Content("~/Content/kendo/2013.1.319/kendo.dataviz.min.css")" rel="stylesheet" type="text/css" />
    <link href="@Url.Content("~/Content/kendo/2013.1.319/kendo.default.min.css")" rel="stylesheet" type="text/css" />
    <link href="@Url.Content("~/Content/kendo/2013.1.319/kendo.dataviz.default.min.css")" rel="stylesheet" type="text/css" />
    <script src="@Url.Content("~/Scripts/kendo/2013.1.319/jquery.min.js")"></script>
    <script src="@Url.Content("~/Scripts/kendo/2013.1.319/kendo.all.min.js")"></script>
    <script src="@Url.Content("~/Scripts/kendo/2013.1.319/kendo.aspnetmvc.min.js")"></script>
    <script src="@Url.Content("~/Scripts/kendo.modernizr.custom.js")"></script>
    <script src="@Url.Content("~/Scripts/syntaxhighlighter/shCore.js")" type="text/javascript"></script>
    <link href="@Url.Content("~/Content/syntaxhighlighter/shCore.css")" rel="stylesheet" type="text/css" />
    <link href="@Url.Content("~/Content/syntaxhighlighter/shThemeRDark.css")" rel="stylesheet" type="text/css" />
    <script src="@Url.Content("~/Scripts/syntaxhighlighter/shBrushCSharp.js")" type="text/javascript"></script>
    <script src="@Url.Content("~/Scripts/syntaxhighlighter/shBrushPhp.js")" type="text/javascript"></script>
    <script src="@Url.Content("~/Scripts/syntaxhighlighter/shBrushXml.js")" type="text/javascript"></script>
    <script src="@Url.Content("~/Scripts/syntaxhighlighter/shBrushCpp.js")" type="text/javascript"></script>
    <script src="@Url.Content("~/Scripts/syntaxhighlighter/shBrushBash.js")" type="text/javascript"></script>
    <script src="@Url.Content("~/Scripts/syntaxhighlighter/shBrushSql.js")" type="text/javascript"></script>
    <script src="@Url.Content("~/Scripts/lightbox/jquery-1.7.2.min.js")" type="text/javascript"></script>
    <script src="@Url.Content("~/Scripts/lightbox/lightbox.js")" type="text/javascript"></script>
    <link href="@Url.Content("~/Content/lightbox/lightbox.css")" rel="stylesheet" type="text/css" />
    Let's dive into bundling all of our JavaScript files. First off, create a new class; I called it BundleConfig. Inside this class add the following static function:
    public static void RegisterBundles(BundleCollection bundles)
    {
        // JavaScript Files
        bundles.Add(new ScriptBundle("~/Bundles/kendoBundle")
            .Include("~/Scripts/kendo/2013.1.319/jquery.min.js")
            .Include("~/Scripts/kendo/2013.1.319/kendo.all.min.js")
            .Include("~/Scripts/kendo/2013.1.319/kendo.aspnetmvc.min.js")
            .Include("~/Scripts/kendo.modernizr.custom.js")
        );

        bundles.Add(new ScriptBundle("~/Bundles/syntaxBundle")
            .Include("~/Scripts/syntaxhighlighter/shCore.js")
            .Include("~/Scripts/syntaxhighlighter/shBrushCSharp.js")
            .Include("~/Scripts/syntaxhighlighter/shBrushPhp.js")
            .Include("~/Scripts/syntaxhighlighter/shBrushXml.js")
            .Include("~/Scripts/syntaxhighlighter/shBrushCpp.js")
            .Include("~/Scripts/syntaxhighlighter/shBrushBash.js")
            .Include("~/Scripts/syntaxhighlighter/shBrushSql.js")
        );

        bundles.Add(new ScriptBundle("~/Bundles/lightboxBundle")
            .Include("~/Scripts/lightbox/jquery-1.7.2.min.js")
            .Include("~/Scripts/lightbox/lightbox.js")
        );
    }
    Then in your _Layout.cshtml replace all of the original JavaScript tags with the following 3 lines:
    @Scripts.Render("~/Bundles/kendoBundle")
    @Scripts.Render("~/Bundles/syntaxBundle")
    @Scripts.Render("~/Bundles/lightboxBundle")
    So afterwards that block of code should look like:
    <link href="@Url.Content("~/Content/Site.css")" rel="stylesheet" type="text/css" />
    <link href="@Url.Content("~/Content/kendo/2013.1.319/kendo.common.min.css")" rel="stylesheet" type="text/css" />
    <link href="@Url.Content("~/Content/kendo/2013.1.319/kendo.dataviz.min.css")" rel="stylesheet" type="text/css" />
    <link href="@Url.Content("~/Content/kendo/2013.1.319/kendo.default.min.css")" rel="stylesheet" type="text/css" />
    <link href="@Url.Content("~/Content/kendo/2013.1.319/kendo.dataviz.default.min.css")" rel="stylesheet" type="text/css" />
    <link href="@Url.Content("~/Content/syntaxhighlighter/shCore.css")" rel="stylesheet" type="text/css" />
    <link href="@Url.Content("~/Content/syntaxhighlighter/shThemeRDark.css")" rel="stylesheet" type="text/css" />
    <link href="@Url.Content("~/Content/lightbox/lightbox.css")" rel="stylesheet" type="text/css" />
    @Scripts.Render("~/Bundles/kendoBundle")
    @Scripts.Render("~/Bundles/syntaxBundle")
    @Scripts.Render("~/Bundles/lightboxBundle")
    Finally go to your Global.asax.cs file and inside your Application_Start function add the following line:
    BundleConfig.RegisterBundles(BundleTable.Bundles);
    So in the end your Application_Start function should look like:
    protected void Application_Start()
    {
        AreaRegistration.RegisterAllAreas();

        RegisterGlobalFilters(GlobalFilters.Filters);
        RegisterRoutes(RouteTable.Routes);

        BundleConfig.RegisterBundles(BundleTable.Bundles);
    }
    Now after re-running the YSlow test: [caption id="attachment_2092" align="aligncenter" width="300"]YSlow Ratings after Bundling of JavaScript Files in the MVC App[/caption] Much improved; now we're rated better than WordPress itself. Now onto the bundling of the CSS styles. Add the following below the previously added ScriptBundles in your BundleConfig class:
    // CSS Stylesheets
    bundles.Add(new StyleBundle("~/Bundles/stylesheetBundle")
        .Include("~/Content/Site.css")
        .Include("~/Content/lightbox/lightbox.css")
        .Include("~/Content/syntaxhighlighter/shCore.css")
        .Include("~/Content/syntaxhighlighter/shThemeRDark.css")
        .Include("~/Content/kendo/2013.1.319/kendo.common.min.css")
        .Include("~/Content/kendo/2013.1.319/kendo.dataviz.min.css")
        .Include("~/Content/kendo/2013.1.319/kendo.default.min.css")
        .Include("~/Content/kendo/2013.1.319/kendo.dataviz.default.min.css")
    );
    And then in your _Layout.cshtml add the following in place of all of your CSS includes:
    @Styles.Render("~/Bundles/stylesheetBundle")
    So when you're done, that whole block should look like the following:
    @Styles.Render("~/Bundles/stylesheetBundle")
    @Scripts.Render("~/Bundles/kendoBundle")
    @Scripts.Render("~/Bundles/syntaxBundle")
    @Scripts.Render("~/Bundles/lightboxBundle")
    One thing I should note: if your bundling isn't working, check your Routes. Because of my Routes, after deployment (and after making sure debug was set to false in the web.config), I was getting 404 errors on my JavaScript and CSS Bundles. My solution was to use the IgnoreRoute method in my Global.asax.cs file:
    routes.IgnoreRoute("Bundles/*");
    For completeness here is my complete RegisterRoutes:
    routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

    routes.MapHttpRoute(
        name: "DefaultApi",
        routeTemplate: "api/{controller}/{id}",
        defaults: new { id = RouteParameter.Optional }
    );

    routes.IgnoreRoute("Bundles/*");

    routes.MapRoute("Error", "Error/", new { controller = "Error", action = "Index" });

    routes.MapRoute("Search", "Search/{searchQueryString}", new { controller = "Home", action = "Search" });

    routes.MapRoute("Feed", "Feed", new { controller = "Home", action = "Feed" });

    routes.MapRoute("Tags", "tag/{tagName}", new { controller = "Home", action = "Tags" });

    routes.MapRoute("PostsRoute", "{year}", new { controller = "Home", action = "Posts" }, new { year = @"\d+" });

    routes.MapRoute("ContentPageRoute", "{pageName}", new { controller = "Home", action = "ContentPage" });

    routes.MapRoute("PostRoute", "{year}/{month}/{day}/{postname}", new { controller = "Home", action = "SinglePost" }, new { year = @"\d+", month = @"\d+", day = @"\d+" });

    routes.MapRoute("Default", "{controller}/{action}", new { controller = "Home", action = "Index" });
    Afterwards everything was set properly, and if you view your page's source you'll notice how MVC renders the bundled HTML:
    <link href="/Bundles/stylesheetBundle?v=l3WYXmrN_hnNspLLaGDUm95yFLXPFiLx613TTF4zSKY1" rel="stylesheet"/>
    <script src="/Bundles/kendoBundle?v=-KrP5sDXLpezNwcL3Evn9ASyJPShvE5al3knHAy2MOs1"></script>
    <script src="/Bundles/syntaxBundle?v=NQ1oIC63jgzh75C-QCK5d0B22diL-20L4v96HctNaPo1"></script>
    <script src="/Bundles/lightboxBundle?v=lOBITxhp8sGs5ExYzV1hgOS1oN3p1VUnMKCjnAbhO6Y1"></script>
    After re-running YSlow: [caption id="attachment_2095" align="aligncenter" width="300"]YSlow after all bundling in MVC[/caption] Now we receive a score of 96. What's next? Caching.

    MVC Caching

    Now that we've reduced the amount of data being pushed out to the client and optimized the number of HTTP requests, let's switch gears to reducing the load on the server and enhancing the performance of the site. Without diving into all of the intricacies of caching, I am going to turn on server side caching, specifically Output Caching. At a later date I will dive into other approaches, including the new HTML5 client side caching that I recently dove into. That being said, turning on Output Caching in your MVC application is really easy; simply put the OutputCache attribute above your ActionResults like so:
    [OutputCache(Duration = 3600, VaryByParam = "*")]
    public ActionResult SinglePost(int year, int month, int day, string postname)
    {
        // ...
    }
    In this example, the ActionResult will be cached for one hour (3600 seconds) and, by setting VaryByParam to *, each combination of arguments passed into the function is cached, versus caching one argument combination and displaying that single result for everyone. I've seen developers simply turn on caching without thinking about dynamic content; suffice it to say, think about what can be cached and what can't. Common items that don't change often, like your header or sidebar, can be cached without much thought, but think about User/Role specific content and how bad it would be for a "Guest" user to see content as an Admin, simply because an Admin had accessed the page within the cache window before the Guest user had.
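    One way to handle that role scenario is MVC's VaryByCustom hook, which lets the cache key include the current user's role. The sketch below is my own illustration, not from the original implementation; the "role" key name and the Roles API usage are assumptions (it presumes the Role provider is configured).

    ```csharp
    // On the action: cache separately per custom string "role".
    // [OutputCache(Duration = 3600, VaryByParam = "*", VaryByCustom = "role")]

    // In Global.asax.cs - return a different string per role so an Admin and
    // a Guest never share the same cached copy of a page.
    public override string GetVaryByCustomString(HttpContext context, string custom)
    {
        if (custom == "role")
        {
            return context.User.Identity.IsAuthenticated
                ? String.Join(",", System.Web.Security.Roles.GetRolesForUser())
                : "guest";
        }

        return base.GetVaryByCustomString(context, custom);
    }
    ```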


    In this post I went through the last three big items in my migration from WordPress to MVC: Search Handling, Custom Error Pages and Caching. That being said, I have a few "polish" items to accomplish before switching the site over to all of the new code, namely additional testing and adding a basic admin section. After those items I will consider Phase 1 completed and go back to my Windows Phone projects. Stay tuned for Post 9 tomorrow night with the polish items.
    I can't believe it's been a week to the day since I began this project, but I am glad at the amount of progress I have made thus far. Tonight I will dive into adding a WCF Service to act as a layer between the logic and data layers done in previous posts (Part 1, Part 2, Part 3, Part 4, Part 5 and Part 6) and into adding RSS support to the site.

    Integrating a WCF Service

    First off, for those that aren't familiar, WCF (Windows Communication Foundation) is an extremely powerful Web Service technology created by Microsoft. I first dove into WCF in April 2010 when diving into Windows Phone development, as there was no support for the "classic" ASMX Web Services. Since then I have used WCF Services as the layer for all ASP.NET WebForms, ASP.NET MVC, native mobile apps and other WCF Services at work. I should note that WCF to WCF communication is done at the binary level, meaning it doesn't send XML between the services, something I found extremely enlightening that Microsoft implemented. At its most basic level, a WCF Service is comprised of two components: the Service Interface Definition file and the actual implementation. In the case of the migration, I created my Interface as follows:
    [ServiceContract]
    public interface IWCFService
    {
        [OperationContract]
        lib.Objects.Post GetSinglePost(int year, int month, int day, string postname);

        [OperationContract]
        List<lib.Objects.Comment> GetCommentsFromPost(int postID);

        [OperationContract(IsOneWay = true)]
        void AddComment(string PersonName, string EmailAddress, string Body, int PostID);

        [OperationContract]
        lib.Objects.Content GetContent(string pageName);

        [OperationContract]
        List<lib.Objects.Post> GetPosts(DateTime startDate, DateTime endDate);

        [OperationContract]
        List<lib.Objects.Post> GetPostsByTags(string tagName);

        [OperationContract]
        List<lib.Objects.ArchiveItem> GetArchiveList();

        [OperationContract]
        List<lib.Objects.LinkItem> GetLinkList();

        [OperationContract]
        List<lib.Objects.TagCloudItem> GetTagCloud();

        [OperationContract]
        List<lib.Objects.MenuItem> GetMenuItems();
    }
    The one thing to note: the IsOneWay property atop the AddComment function indicates that the client doesn't expect a return value. As noted in last night's post, the end user is not going to want to wait for all the emails to be sent; they simply want their comment to be posted and the Comment Listing refreshed with their comment. By setting IsOneWay to true, you ensure the client's experience is fast no matter how much server side work is being done. And the actual implementation:
    public class WCFService : IWCFService
    {
        public Post GetSinglePost(int year, int month, int day, string postname)
        {
            using (var pFactory = new PostFactory())
            {
                var post = pFactory.GetPost(postname)[0];
                post.Comments = pFactory.GetCommentsFromPost(post.ID);

                return post;
            }
        }

        public List<Comment> GetCommentsFromPost(int postID)
        {
            using (var pFactory = new PostFactory())
            {
                return pFactory.GetCommentsFromPost(postID);
            }
        }

        public void AddComment(string PersonName, string EmailAddress, string Body, int PostID)
        {
            using (var pFactory = new PostFactory())
            {
                pFactory.addComment(PostID, PersonName, EmailAddress, Body);
            }
        }

        public Content GetContent(string pageName)
        {
            using (var cFactory = new ContentFactory())
            {
                return cFactory.GetContent(pageName);
            }
        }

        public List<Post> GetPosts(DateTime startDate, DateTime endDate)
        {
            using (var pFactory = new PostFactory())
            {
                return pFactory.GetPosts(startDate, endDate);
            }
        }

        public List<Post> GetPostsByTags(string tagName)
        {
            using (var pFactory = new PostFactory())
            {
                return pFactory.GetPostsByTags(tagName);
            }
        }

        public List<ArchiveItem> GetArchiveList()
        {
            using (var pFactory = new PostFactory())
            {
                return pFactory.GetArchiveList();
            }
        }

        public List<LinkItem> GetLinkList()
        {
            using (var pFactory = new PostFactory())
            {
                return pFactory.GetLinkList();
            }
        }

        public List<TagCloudItem> GetTagCloud()
        {
            using (var pFactory = new PostFactory())
            {
                return pFactory.GetTagCloud();
            }
        }

        public List<MenuItem> GetMenuItems()
        {
            using (var bFactory = new BaseFactory())
            {
                return bFactory.GetMenuItems();
            }
        }
    }
    One thing you might be asking: isn't this a security risk? If you're not, you should be. Think about it: anyone who has access to your WCF Service could add comments and pull down your data at will. In its current state this isn't a huge deal, since the service only returns data and the AddComment Operation Contract requires a comment to be approved before it is displayed, but what about when the administrator functionality is implemented? You definitely don't want to expose those contracts to the outside world with nothing but their parameters guarding them. So what can you do?
    1. Keep your WCF Service not exposed to the internet - this is problematic in today's world where a mobile presence is almost a necessity. Granted if one were to only create a MVC 4 Mobile Web Application you could keep it behind a firewall. My thought process currently is design and do it right the first time and don't corner yourself into a position where you have to go back and do additional work.
    2. Add a username, password or some token to each Operation Contract and then verify the user - this approach works and I've done it that way for public WCF Services. The problem is it becomes a lot of extra work on both the client and server side. Client side you can create a base class with the token or username/password and simply pass it into each contract, and then server side do a similar implementation.
    3. Implement message level security or Forms Membership - this approach requires the most upfront work, but reaps the most benefits as it keeps your Operation Contracts clean and offers an easy path to update at a later date.
    Going forward I will be implementing the third option, and of course I will document the process. Hopefully this helps get developers thinking about security and better approaches to problems. Moving onto the second half of the post: creating an RSS Feed.
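    As a rough illustration of the second option above (not the approach I will be using), every contract would carry a token that the service validates before doing any work. All names here - AuthenticatedRequest, AddCommentRequest, TokenStore - are hypothetical placeholders, not part of the real bbXP code:

    ```csharp
    // Hypothetical shared base request carrying an auth token.
    [DataContract]
    public class AuthenticatedRequest
    {
        [DataMember]
        public string Token { get; set; }
    }

    [DataContract]
    public class AddCommentRequest : AuthenticatedRequest
    {
        [DataMember]
        public string PersonName { get; set; }

        [DataMember]
        public string Body { get; set; }

        [DataMember]
        public int PostID { get; set; }
    }

    // Server side, every operation would start with the same check.
    public void AddComment(AddCommentRequest request)
    {
        if (!TokenStore.IsValid(request.Token)) // TokenStore is a placeholder
        {
            throw new FaultException("Invalid or expired token");
        }

        // ...proceed with the comment logic as before
    }
    ```

    This is exactly the duplication the text warns about: the validation boilerplate repeats in every Operation Contract, which is why message level security or Forms Membership scales better.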

    Creating an RSS Feed

    After getting my WCF Service classes in place, I created a new Stored Procedure in preparation:

    [sql]
    CREATE PROCEDURE dbo.getRSSFeedListSP
    AS
    SELECT TOP 25
        dbo.Posts.Created,
        dbo.Posts.Title,
        LEFT(CAST(dbo.Posts.Body AS VARCHAR(MAX)), 200) + '...' AS 'Summary',
        dbo.Posts.URLSafename
    FROM dbo.Posts
    INNER JOIN dbo.Users ON dbo.Users.ID = dbo.Posts.PostedByUserID
    WHERE dbo.Posts.Active = 1
    ORDER BY dbo.Posts.Created DESC
    [/sql]

    Basically this returns the 25 most recent posts along with up to the first 200 characters of each post. Afterwards I created a class to translate the Entity Framework Complex Type:
    [DataContract]
    public class PostFeedItem
    {
        [DataMember]
        public DateTime Published { get; set; }

        [DataMember]
        public string Title { get; set; }

        [DataMember]
        public string Description { get; set; }

        [DataMember]
        public string URL { get; set; }

        public PostFeedItem(DateTime published, string title, string description, string url)
        {
            Published = published;
            Title = title;
            Description = description;
            URL = url;
        }
    }
    And then I added a new Operation Contract in my WCF Service:
    public List<lib.Objects.PostFeedItem> GetFeedList()
    {
        using (var pFactory = new PostFactory())
        {
            return pFactory.GetFeedList();
        }
    }
    Now I am going to leave it up to you which path to implement. At this point you've got all the backend work done to return the data you need to write your RSS XML file. There are many ways to proceed, and it really depends on how you want to serve your RSS Feed. Do you want it to regenerate on the fly for each request? Or do you want to write an XML file only when a new Post is published and simply serve the static XML file? From what my research gave me, there are multiple ways to do each of those. I am in favor of doing the work once and writing it out to a file rather than doing all of that work on each request; the latter seems like a waste of server resources.

    Generate Once
    1. One being using the Typed DataSet approach I used in Part 1 - requires very little work and if you're like me, you like a strongly typed approach.
    2. Another option is to use the built-in SyndicationFeed class to create your RSS Feed's XML - an approach I hadn't researched prior to this.
    3. Use the lower level XmlWriter functionality in .NET to build your RSS Feed's XML - I strongly urge you not to do this given the 2 strongly typed approaches above. Weakly typed code leads to spaghetti and a debugging disaster when something goes wrong.
    Generate On-The-Fly
    1. Use the previously completed WCF OperationContract to return the data and then use something like MVC Contrib to return an XmlResult from your MVC Controller.
    2. Set your MVC View to return XML and simply iterate through all of the Post items.
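    To make the on-the-fly idea concrete, here is a rough sketch of a controller action that builds and serves the feed per request. The action name and content type are my assumptions, and it reuses the built-in SyndicationFeed/Rss20FeedFormatter classes rather than MVC Contrib:

    ```csharp
    // Hypothetical on-the-fly variant: build the RSS XML on every request and
    // return it directly, with no static rss.xml file involved.
    public ActionResult Feed()
    {
        var feed = new SyndicationFeed();
        // ...populate feed from the WCF Service, as shown later in this post

        var sb = new StringBuilder();

        using (var writer = XmlWriter.Create(sb))
        {
            new Rss20FeedFormatter(feed).WriteTo(writer);
        }

        return Content(sb.ToString(), "application/rss+xml");
    }
    ```

    The trade-off is exactly the one described above: the XML work is repeated on every request instead of once per published Post.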
    Those are just some ways to accomplish the goal of creating an RSS Feed for your MVC site. Which is right? I think it is up to you to find what works best for you. That being said, I am going to walk through the first 2 Generate Once options. For both approaches I am going to use IIS's URL Rewrite functionality to route /feed to the static rss.xml file. For those interested, all it took was the following block in my web.config in the system.webServer section:

    [xml]
    <rewrite>
        <rules>
            <rule name="RewriteUserFriendlyURL1" stopProcessing="true">
                <match url="^feed$" />
                <conditions>
                    <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
                    <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
                </conditions>
                <action type="Rewrite" url="rss.xml" />
            </rule>
        </rules>
    </rewrite>
    [/xml]

    To learn more about URL Rewrite go to the official site here.

    Option 1 - XSD Approach

    Utilizing a similar approach to how I got started with the XSD tool in Part 1, I generated a typed dataset based on the format of an RSS XML file:

    [xml]
    <?xml version="1.0"?>
    <rss version="2.0">
        <channel>
            <title>Jarred Capellman</title>
            <link></link>
            <description>Putting 1s and 0s to work since 1995</description>
            <language>en-us</language>
            <item>
                <title>Version 2.0 Up!</title>
                <link></link>
                <description>Yeah in all its glory too, it's far from complete, the forum will be up tonight most likely...</description>
                <pubDate>5/4/2012 12:00:00 AM</pubDate>
            </item>
        </channel>
    </rss>
    [/xml]

    [caption id="attachment_2056" align="aligncenter" width="300"]Generated Typed Data Set for RSS[/caption] Then in my HomeController I wrote a function to handle writing the XML, to be called when a new Post is entered into the system:
    private void writeRSSXML()
    {
        var dt = new rss();

        using (var ws = new WCFServiceClient())
        {
            var feedItems = ws.GetFeedList();

            var channelRow = dt.channel.NewchannelRow();
            channelRow.title = Common.Constants.SITE_NAME;
            channelRow.description = Common.Constants.SITE_DESCRIPTION;
            channelRow.language = Common.Constants.SITE_LANGUAGE;
            channelRow.link = Common.Constants.URL;

            dt.channel.AddchannelRow(channelRow);
            dt.channel.AcceptChanges();

            foreach (var item in feedItems)
            {
                var itemRow = dt.item.NewitemRow();
                itemRow.SetParentRow(channelRow);
                itemRow.description = item.Description;
                itemRow.link = buildPostURL(item.URL, item.Published);
                itemRow.pubDate = item.Published.ToString(CultureInfo.InvariantCulture);
                itemRow.title = item.Title;

                dt.item.AdditemRow(itemRow);
                dt.item.AcceptChanges();
            }
        }

        var xmlString = dt.GetXml();
        xmlString = xmlString.Replace("<rss>", "<?xml version=\"1.0\" encoding=\"utf-8\"?><rss version=\"2.0\">");

        using (var sw = new StreamWriter(HttpContext.Server.MapPath("~/rss.xml")))
        {
            sw.Write(xmlString);
        }
    }
    Pretty intuitive code, with one exception - I could not find a way to add the version property to the rss element, thus having to use the GetXml() method and a String.Replace instead of simply calling dt.WriteXml(HttpContext.Server.MapPath("~/rss.xml")). Overall I find this approach very acceptable, but not perfect.

    Option 2 - Syndication Approach

    Not 100% satisfied with the XSD approach mentioned above, I dove into the SyndicationFeed class. Be sure to include using System.ServiceModel.Syndication; at the top of your MVC Controller. I created the same function as above, but this time utilizing the SyndicationFeed class that is built into .NET:
    private void writeRSSXML()
    {
        using (var ws = new WCFServiceClient())
        {
            var feed = new SyndicationFeed();
            feed.Title = SyndicationContent.CreatePlaintextContent(Common.Constants.SITE_NAME);
            feed.Description = SyndicationContent.CreatePlaintextContent(Common.Constants.SITE_DESCRIPTION);
            feed.Language = Common.Constants.SITE_LANGUAGE;
            feed.Links.Add(new SyndicationLink(new Uri(Common.Constants.URL)));

            var feedItems = new List<SyndicationItem>();

            foreach (var item in ws.GetFeedList())
            {
                var sItem = new SyndicationItem();
                sItem.Title = SyndicationContent.CreatePlaintextContent(item.Title);
                sItem.PublishDate = item.Published;
                sItem.Summary = SyndicationContent.CreatePlaintextContent(item.Description);
                sItem.Links.Add(new SyndicationLink(new Uri(buildPostURL(item.URL, item.Published))));

                feedItems.Add(sItem);
            }

            feed.Items = feedItems;

            var rssWriter = XmlWriter.Create(HttpContext.Server.MapPath("~/rss.xml"));

            var rssFeedFormatter = new Rss20FeedFormatter(feed);
            rssFeedFormatter.WriteTo(rssWriter);
            rssWriter.Close();
        }
    }
    On first glance you might notice very similar code between the two approaches, with one major exception - there are no hacks to make it work as intended. Between the two I am going to go live with the latter approach; not having to worry about the String.Replace ever failing and not having any "magic" strings is worth it. But I will leave the decision to you as to which to implement - or maybe there's another approach I didn't mention; please comment if you have one, as I am always open to "better" or alternate approaches. Now that the WCF Service is fully integrated and RSS Feeds have been added, as far as the end user view goes there are but a few features remaining: Caching, Searching Content and Error Pages. Stay tuned for Part 8 tomorrow.
    Nearing the end of my initial migration, in Part 6 I dove into Comment Listing, adding new Comments and then emailing users when a new comment is entered. (Other posts: Part 1, Part 2, Part 3, Part 4 and Part 5.) In Part 5 I imported the Comments, but wasn't doing anything with them beyond showing the Comment Count in the Post Title. In this post I will begin with what I did to display the comments. First off, I added a new column to my PostComments table to handle Comments that were not approved or are pending approval (thinking about spam bots in particular). After adding that new column, I created a Stored Procedure to return the Comments for a given Post. Some (maybe most) might find creating a Stored Procedure to simply return one table unnecessary, but I find it helps keep my C# much cleaner by adding that layer between my SQL Database and my C# code.

    [sql]
    CREATE PROCEDURE [dbo].[getPostCommentsSP] (@PostID INT)
    AS
    SELECT
        dbo.PostComments.Modified,
        dbo.PostComments.Body,
        dbo.PostComments.Name
    FROM dbo.PostComments
    WHERE dbo.PostComments.PostID = @PostID
        AND dbo.PostComments.IsApproved = 1
        AND dbo.PostComments.Active = 1
    [/sql]

    I proceeded to add a new function in my PostFactory to return the converted List collection:
[csharp]
public List<Objects.Comment> GetCommentsFromPost(int PostID)
{
    using (var eFactory = new bbxp_jarredcapellmanEntities())
    {
        return eFactory.getPostCommentsSP(PostID).Select(a => new Objects.Comment(a.Name, a.Body, a.Modified)).ToList();
    }
}
[/csharp]
Because I had already created a SinglePost ActionResult, I simply added the one line to also include my newly created Comments List collection:
[csharp]
model.Post.Comments = pFactory.GetCommentsFromPost(model.Posts[0].ID);
[/csharp]
    Since the main listings do not display the Comments, just the count, it was necessary to have a unique ActionResult. That being said, I did reuse my PartialView I created the other night, only adding to it:
[csharp]
@if (@Model.Comments != null)
{
    <div id="PostListing">
        <div class="Title">
            <h2>Comments</h2>
        </div>
        @foreach (var comment in @Model.Comments)
        {
            <div class="Comment">
                <div class="Title">
                    <h3>@comment.Name - @comment.PostTime</h3>
                </div>
                <div class="Body">
                    @comment.Body
                </div>
            </div>
        }
    </div>
}
[/csharp]
Because the Comments are otherwise null, I can reuse the PartialView. After adding in all of the CSS Styles: [caption id="attachment_2039" align="aligncenter" width="300"]Comments Listing in MVC4 Site[/caption] Next on the list of things I wanted to accomplish was adding a comment form below the Comments Listing. Adding a basic form is pretty trivial for a web developer, but here is the code I am using:
[csharp]
@if (@ViewBag.SinglePost != null)
{
    <div class="CommentForm">
        <input type="hidden" name="PostID" value="@Model.ID" />
        <div class="Title">
            <h2>Add a Comment</h2>
        </div>
        <div class="Fields">
            <input type="text" id="PersonName" name="PersonName" class="k-textbox" required placeholder="Name" /><span class="requiredField">*</span><br/><br/>
            <input type="text" id="EmailAddress" name="EmailAddress" class="k-textbox" required placeholder="Email Address" /><span class="requiredField">*</span><br/><br/>
        </div>
        <div class="Body">
            <textarea class="k-textbox" id="Body" name="Body" cols="500" maxlength="9999" wrap="soft" rows="5" placeholder="Enter comment here"></textarea><span class="requiredField">*</span><br/>
        </div>
        <div class="Submit">
            <button class="k-button" type="submit">Submit Comment >></button>
        </div>
    </div>
}
[/csharp]
    And the following line right above the CommentListing posted above:
[csharp]
@using (Ajax.BeginForm("AddComment", "Home", new AjaxOptions { UpdateTargetId = "PostListing" }))
{
    ...
}
[/csharp]
In my PostFactory I added the following code; note the line that auto-approves a comment if the name/email combination had previously been approved, just like WordPress does:
[csharp]
public void addComment(int PostID, string name, string email, string body)
{
    using (var eFactory = new bbxp_jarredcapellmanEntities())
    {
        var comment = eFactory.PostComments.Create();

        comment.Active = true;
        comment.Body = body;
        comment.Created = DateTime.Now;
        comment.Email = email;
        comment.Modified = DateTime.Now;
        comment.Name = name;
        comment.PostID = PostID;
        comment.IsApproved = eFactory.PostComments.Any(a => a.Name == name && a.Email == email && a.Active && a.IsApproved);

        eFactory.PostComments.Add(comment);
        eFactory.SaveChanges();
    }
}
[/csharp]
    A feature of WordPress I realized I enjoyed was the fact it emailed me when a new comment was entered in the system. So I figured I would add the same functionality to my MVC app. One thing I should note, this is far from ideal code. Think of a larger site, with hundreds or thousands of comments from users. The user would have to wait until all of the emails were sent and then return the user to the post they added their comment to. A better approach would be to offload this potentially long running task to a Windows Service - a feature I will be adding shortly.
[csharp]
// If the comment wasn't approved don't bother continuing to process
if (!comment.IsApproved)
{
    return;
}

// Grab the existing approved comments and exclude the newly added comment
var existingComments = eFactory.getPostCommentsSP(PostID).Where(a => a.ID != comment.ID).ToList();

// Make sure there is at least 1 other comment in the system
if (existingComments.Count == 0)
{
    return;
}

// Grab the Post to get the Post Title
var post = eFactory.Posts.FirstOrDefault(a => a.ID == PostID);

// Populate the Title and Body sections
var Title = "Comment: \"" + post.Title + "\"";
var Body = String.Format("The following comment by {0} was added:" + System.Environment.NewLine + "{1}", comment.Name, comment.Body);

// Email each recipient individually so as to not reveal others' email addresses to each other
using (var smtpClient = new SmtpClient())
{
    foreach (var existingComment in existingComments)
    {
        smtpClient.Send(ConfigurationManager.AppSettings["EMAILADDRESS_CommentNotification"], existingComment.Email, Title, Body);
    }
}
[/csharp]
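The hand-off to that future Windows Service could be as simple as the web app recording pending notifications and a worker draining them on its own schedule, so the request never waits on SMTP. Here is a standalone sketch of the queue idea - the types and names are mine, not bbXP's, and in practice the queue would likely be a SQL table the service polls rather than an in-memory collection:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

// Hypothetical notification payload; in a real deployment this would be a
// row in a SQL table that the Windows Service polls.
var pending = new ConcurrentQueue<(string Email, string Subject, string Body)>();

// Web side: adding a comment only enqueues, which is effectively instant.
void QueueNotification(string email, string subject, string body) =>
    pending.Enqueue((email, subject, body));

QueueNotification("a@example.com", "Comment: \"Post 1\"", "New comment body");
QueueNotification("b@example.com", "Comment: \"Post 1\"", "New comment body");

// Service side: drain and send on its own schedule
// (the SmtpClient.Send call from above would go here).
var sent = new List<string>();
while (pending.TryDequeue(out var n))
    sent.Add(n.Email);

Console.WriteLine(string.Join(",", sent)); // a@example.com,b@example.com
```

The point of the split is that a slow or failing SMTP server only delays the background worker, never the visitor who just posted a comment.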
    So what is next? Implementing the WCF Service previously mentioned and the Windows Service mentioned above. This will allow me to easily create Windows Phone 8, Windows 8 Store apps or heck even a command line version if there was demand. More to come...
    Continuing onto Part 5 of my migration from WordPress to MVC 4, I dove into Content, Comments and Routing tonight. (Other Posts: Part 1, Part 2, Part 3 and Part 4). First thing I did tonight was add a new route to handle pages in the same way WordPress does (YYYY/MM/DD/) for several reasons, though my primary reason is to retain all of the links from the existing WordPress site - something I'd highly suggest you consider doing as well. As noted the other night, your MVC Routing is contained in your Global.asax.cs file. Below is the route I added to accept the same format as WordPress:
[csharp]
routes.MapRoute("ContentPageRoute", "{pagename}", new { controller = "Home", action = "ContentPage" });
[/csharp]
Be sure to put it before the Default Route, otherwise the route above will never be hit. After I got the Route set up, I went back into my _Layout.cshtml and updated the header links to pull from a SQL Table and return the results to the layout:
[csharp]
<div class="HeaderMenu">
    <nav>
        <ul id="menu">
            <li>@Html.ActionLink("home", "Index", "Home")</li>
            @{
                foreach (bbxp.lib.Objects.MenuItem menuItem in @Model.Base.MenuItems)
                {
                    <li>@Html.ActionLink(@menuItem.Title, "ContentPage", "Home", new { pagename = @menuItem.URLName }, null)</li>
                }
            }
        </ul>
    </nav>
</div>
[/csharp]
    Further down the road I plan to add a UI interface to adjust the menu items, thus the need to make it programmatic from the start. Next on the list was actually importing the content from the export functionality in WordPress. Thankfully the structure is similar to the actual posts so it only took the following code to get them all imported:
[csharp]
if (item.post_type == "page")
{
    var content = eFactory.Contents.Create();

    content.Active = true;
    content.Body = item.encoded;
    content.Created = DateTime.Parse(item.post_date);
    content.Modified = DateTime.Parse(item.post_date);
    content.PostedByUserID = creator.ID;
    content.Title = item.title;
    content.URLSafename = item.post_name;

    eFactory.Contents.Add(content);
    eFactory.SaveChanges();

    continue;
}
[/csharp]
    With some time to spare, I started work on the Comments piece of the migration. Immediately after the Post creation in the Importer, I added the following to import all of the comments:
[csharp]
foreach (var comment in item.GetcommentRows())
{
    var nComment = eFactory.PostComments.Create();

    nComment.Active = true;
    nComment.Body = comment.comment_content;
    nComment.Created = DateTime.Parse(comment.comment_date);
    nComment.Modified = DateTime.Parse(comment.comment_date);
    nComment.PostID = post.ID;
    nComment.Email = comment.comment_author_email;
    nComment.Name = comment.comment_author;

    eFactory.PostComments.Add(nComment);
    eFactory.SaveChanges();
}
[/csharp]
    And now that there were actual comments in the system, I went back into my partial view for the Posts and added the code to display the Comments Link and Total properly:
[csharp]
<div class="CommentLink">
    @{
        object commentLink = @bbxp.mvc.Common.Constants.URL + @Model.PostDate.Year + "/" + @Model.PostDate.Month + "/" + @Model.PostDate.Day + "/" + @Model.URLSafename;
        <h4><a href="@commentLink">@Model.NumComments @(Model.NumComments == 1 ? "Comment" : "Comments")</a></h4>
    }
</div>
[/csharp]
After getting the Comments Count displayed I wanted to do some refactoring on the code up to this point. Now that I've got a pretty good understanding of the MVC architecture, I started to create Base objects. The commonly pulled in data (Tag Cloud, Menu Items, Archive List, etc.) now lives in a BaseModel that is populated in a BaseController, which all of my Controllers now inherit from. I cut down on a good chunk of code and feel pretty confident that as time goes on I will be able to expand upon this baseline architecture very easily. [caption id="attachment_2032" align="aligncenter" width="300"]Migration Project as of Part 5[/caption] So what is on the plate next? Getting the Comments displayed, the ability to post new comments and, on the back end, emailing people when a new comment is entered for a particular post.
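That base-class refactoring can be sketched in plain C#. This is my own simplification, not bbXP's actual code: the real BaseController would derive from System.Web.Mvc.Controller and pull the menu, tag cloud and archive data from the factories, but the inheritance shape is the same (all names below are assumed):

```csharp
using System;
using System.Collections.Generic;

// Demo: the derived controller gets the common sidebar data with no extra code.
var model = new HomeControllerSketch().Index();
Console.WriteLine(model.MenuItems.Count + " menu items, " + model.ArchiveItems.Count + " archive groups");
// prints "2 menu items, 1 archive groups"

// Data every page needs, regardless of which view is being rendered.
public class BaseModel
{
    public List<string> MenuItems = new List<string>();
    public List<string> ArchiveItems = new List<string>();
}

public class HomeModel : BaseModel
{
    // page-specific properties (Posts, etc.) would go here
}

// In the real app this would derive from System.Web.Mvc.Controller.
public abstract class BaseControllerSketch
{
    // Populates the common data; the factory/stored procedure calls would go here.
    protected T PopulateBase<T>(T model) where T : BaseModel
    {
        model.MenuItems.AddRange(new[] { "home", "about" });
        model.ArchiveItems.Add("April 2013 (12)");
        return model;
    }
}

public class HomeControllerSketch : BaseControllerSketch
{
    public HomeModel Index() => PopulateBase(new HomeModel());
}
```

The payoff is that each new page model only declares what is unique to that page; everything shared comes along via the base types.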
Continuing onto Day 4 of my Migration to MVC 4 (Part 1, Part 2 and Part 3), contrary to what I suggested I would focus on last night, I dove into getting all of the MVC Routing in place so the site could start functioning like my existing WordPress site does. Having only really been doing MVC for a month and a half in between MonoDroid, MonoTouch and Windows Phone work, I hadn't had time to really dive into Routing, which I had been excited about implementing in a future project. One of the first hurdles I ran into was leaving the default routing in place - frustratingly, this caused none of my new routes to be processed correctly. Keep that in mind when you first dive into MVC. For those also early on in their MVC path, your routing is defined in the Global.asax.cs file. Your default route is defined in the RegisterRoutes function:
[csharp]
routes.MapRoute(
    name: "Default",
    url: "{controller}/{action}/{id}",
    defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
);
[/csharp]
If you were going to have a URL with the year and month in it, like the /2013/04/ style WordPress archive links, you would define a route like so:
[csharp]
routes.MapRoute("PostsRoute", "{year}/{month}", new { controller = "Home", action = "Posts" }, new { year = @"\d+" });
[/csharp]
I took it one step further by forcing the year parameter to be a number. To make use of this route, just make sure your Home controller has an ActionResult called Posts with both year and month parameters, like so:
[csharp]
public class HomeController : Controller
{
    public ActionResult Posts(int year, string month)
    {
        ...
    }
}
[/csharp]
So now that the routing is working like I wanted, tomorrow night's focus is getting the content pages imported, routed and displayed properly. Being the planner I am, I figured I would map out the next few evenings:
    1. Tuesday - Content Pages (Import, Routing, Display)
    2. Wednesday - Comments (Import, Display, Adding)
    3. Thursday - Create the WCF Service Layer
    4. Friday - Add Login and Basic Add Post Form
    5. Sunday - Edit Post/Content Support
Continuing my series on Migrating from WordPress to MVC4 (Part 1 and Part 2), today I worked on the right hand side bar and on importing Tags from the export mentioned in Part 1. Where did I begin? Based on the original import I made last Friday night, I created a Stored Procedure to populate the right hand side "Archives List". For those curious, here is the SQL:
[sql]
SELECT DATEPART(YEAR, dbo.Posts.Created) AS 'PostYear',
       DATENAME(MONTH, dbo.Posts.Created) AS 'PostMonth',
       (SELECT COUNT(*)
        FROM dbo.Posts postsCount
        WHERE DATENAME(MONTH, postsCount.Created) = DATENAME(MONTH, dbo.Posts.Created)
              AND DATEPART(YEAR, postsCount.Created) = DATEPART(YEAR, dbo.Posts.Created)
              AND postsCount.Active = 1) AS 'NumPosts'
FROM dbo.Posts
WHERE dbo.Posts.Active = 1
GROUP BY DATENAME(MONTH, dbo.Posts.Created), DATEPART(YEAR, dbo.Posts.Created)
ORDER BY DATEPART(YEAR, dbo.Posts.Created) DESC, DATENAME(MONTH, dbo.Posts.Created)
[/sql]
And then in the UI:
[csharp]
<div class="Widget">
    <div class="Title">
        <h3>Archives</h3>
    </div>
    <div class="Content">
        @foreach (var item in @Model.ArchiveItems)
        {
            var baseURL = "" + @item.Year + "/" + @item.Month + "/";
            <div class="ArchiveItem">
                <a href="@baseURL">@item.Month @item.Year (@item.PostCount)</a>
            </div>
        }
    </div>
</div>
[/csharp]
After all is said and done (also included are the SQL stored Links): [caption id="attachment_2004" align="aligncenter" width="100"]Archived List and Links List[/caption] At this point I needed to do a re-import of the data, as I had only imported the Posts. In addition, I added support to import the Categories for Posts while I was at it. If you're referencing the code in this series, I added the following block immediately after the Post row is added. This block parses and imports the Tags and Categories:
[csharp]
foreach (var tag in item.GetcategoryRows())
{
    if (tag.domain == "post_tag")
    {
        var existingTag = eFactory.Tags.FirstOrDefault(a => a.Description == tag.category_Text);

        if (existingTag == null)
        {
            existingTag = eFactory.Tags.Create();

            existingTag.Active = true;
            existingTag.Created = DateTime.Now;
            existingTag.Description = tag.category_Text;
            existingTag.Modified = DateTime.Now;

            eFactory.Tags.Add(existingTag);
            eFactory.SaveChanges();
        }

        var relationalRow = eFactory.Posts2Tags.Create();

        relationalRow.Active = true;
        relationalRow.Created = post.Created;
        relationalRow.Modified = post.Created;
        relationalRow.PostID = post.ID;
        relationalRow.TagID = existingTag.ID;

        eFactory.Posts2Tags.Add(relationalRow);
        eFactory.SaveChanges();
    }
    else if (tag.domain == "category")
    {
        var existingCategory = eFactory.PostCategories.FirstOrDefault(a => a.Description == tag.category_Text);

        if (existingCategory == null)
        {
            existingCategory = eFactory.PostCategories.Create();

            existingCategory.Active = true;
            existingCategory.Created = DateTime.Now;
            existingCategory.Description = tag.category_Text;
            existingCategory.Modified = DateTime.Now;

            eFactory.PostCategories.Add(existingCategory);
            eFactory.SaveChanges();
        }

        var relationalRow = eFactory.Posts2Categories.Create();

        relationalRow.Active = true;
        relationalRow.Created = post.Created;
        relationalRow.Modified = post.Created;
        relationalRow.PostID = post.ID;
        relationalRow.PostCategoryID = existingCategory.ID;

        eFactory.Posts2Categories.Add(relationalRow);
        eFactory.SaveChanges();
    }
}
[/csharp]
Now that the Tags and Categories are imported into SQL Server, I wanted to recreate the Tag Cloud feature that WordPress offers and that Telerik offers in their ASP.NET WebForms Suite. At a base level, all a Tag Cloud really does is render a larger link for the most frequently used Tags, getting decreasingly smaller as the occurrences decrease. So for those only here to check out how I accomplished it, let's dive in. First, I created a Stored Procedure to get the Top 50 used Tags:
[sql]
SELECT TOP 50
       (SELECT COUNT(*)
        FROM dbo.Posts2Tags
        WHERE dbo.Posts2Tags.TagID = dbo.Tags.ID) AS 'NumTags',
       dbo.Tags.Description
FROM dbo.Tags
WHERE dbo.Tags.Active = 1
ORDER BY (SELECT COUNT(*)
          FROM dbo.Posts2Tags
          WHERE dbo.Posts2Tags.TagID = dbo.Tags.ID) DESC,
         dbo.Tags.Description ASC
[/sql]
And then in my Controller in the MVC 4 App:
[csharp]
private List<lib.Objects.TagCloudItem> processTagCloud(List<lib.Objects.TagCloudItem> tagItems)
{
    var startingLevel = 10;

    for (var x = 0; x < tagItems.Count; x++)
    {
        tagItems[x].CSSClassName = "TagItem" + startingLevel;

        if (startingLevel > 1)
        {
            startingLevel--;
        }
    }

    return tagItems.OrderBy(a => a.Name).ToList();
}
[/csharp]
Basically I created 10 "levels" of Tag Cloud sizes; the top 9 Tags get the larger sizes and everything after that falls into the smallest. And the associated CSS:
[css]
/* Tag Cloud Items */
.sideBar .Widget .Content .TagItem1 { font-size: 8pt; }
.sideBar .Widget .Content .TagItem2 { font-size: 9pt; }
.sideBar .Widget .Content .TagItem3 { font-size: 10pt; }
.sideBar .Widget .Content .TagItem4 { font-size: 11pt; }
.sideBar .Widget .Content .TagItem5 { font-size: 12pt; }
.sideBar .Widget .Content .TagItem6 { font-size: 16pt; }
.sideBar .Widget .Content .TagItem7 { font-size: 20pt; }
.sideBar .Widget .Content .TagItem8 { font-size: 24pt; }
.sideBar .Widget .Content .TagItem9 { font-size: 28pt; }
.sideBar .Widget .Content .TagItem10 { font-size: 32pt; }
[/css]
And the Razor View:
[csharp]
<div class="Widget">
    <div class="Title">
        <h3>Tags</h3>
    </div>
    <div class="Content">
        @foreach (bbxp.lib.Objects.TagCloudItem tag in @Model.TagCloudItems)
        {
            var url = "" + @tag.Name + "/";
            <a href="@url" class="@tag.CSSClassName">@tag.Name</a>
        }
    </div>
</div>
[/csharp]
After implementing the SQL, CSS and HTML I got it working just as I wanted: [caption id="attachment_2008" align="aligncenter" width="116"]bbxp TagCloud[/caption] Next up on the list is to import the comments, display them and create a comments form.
Continuing from last night's post, Part 1 of Migrating WordPress to MVC4, I spent some time today working on the visual display of the migrated posts. Like most programmers using WordPress, I utilize the excellent SyntaxHighlighter JavaScript/CSS library for all of my code blocks. The caveat with this: those tags now exist throughout my posts going back a year or more at this point. Luckily, my Regular Expression skills have gone up considerably with projects like my Windows Phone 8 app, jcCMAP, which utilizes XPath and Regular Expressions extensively. So where do you begin? Like many migrations you have a choice: do you migrate the data as-is into the new structure, or do you preprocess it - in this case, the tags - on the way in? Being a firm believer in storing data in as bare a form as possible and handling presentation in the business and presentation layers, I am choosing to leave the tags as they exist. Luckily, the tags follow a very easy to parse syntax: brackets and the name of the language. The first step from last night was to do some refactoring of the Data Layer and split it into a true 3-tier architecture. I first created a PostFactory class to interface with the Entity Framework in my Windows Class Library:
[csharp]
public class PostFactory : IDisposable
{
    public List<Objects.Post> GetPosts(DateTime startDate, DateTime endDate)
    {
        using (var eFactory = new bbxp_jarredcapellmanEntities())
        {
            return eFactory.getPostListingSP(startDate, endDate).Select(a => new Objects.Post(a.ID, a.Created, a.Title, a.Body)).ToList();
        }
    }
    ...
}
[/csharp]
This block grabs all of the posts for a given date range from the getPostListingSP Stored Procedure and then uses LINQ to translate the results into Post objects. The Post object lives in a Portable Class Library (PCL) so it can be utilized by both the MVC4 app and the eventual Windows Phone 8 app. Planning ahead and doing things right from the get go will save you time - don't rush your initial architecture, you'll pay for it later. Next I created my Post object, which encapsulates the properties I want exposed to the clients (the MVC4 app and Windows Phone). Some might find it silly not to simply reuse the Entity Framework Complex Type object that the stored procedure mentioned above returns. I find that approach to be a lack of separation of concerns, crossing tiers between the data and UI layers. For a simple site I might overlook it, but for 99% of the things I do I always have an object that acts as a middle man between the data and UI layers. Now onto the code:
[csharp]
public class Post
{
    // Properties
    public int ID { get; set; }
    public DateTime PostDate { get; set; }
    public string Title { get; set; }
    public string Body { get; set; }
    public string PostBy { get; set; }

    public Post(int id, DateTime postDate, string title, string body)
    {
        ID = id;
        PostDate = postDate;
        Title = title;
        Body = parsePost(body);
    }

    // Parse the SyntaxHighlighter Tags and replace them with the SyntaxHighlighter <pre> tags
    private static string parsePost(string content)
    {
        var matches = Regex.Matches(content, @"\[(.*[a-z])\]");

        foreach (Match match in matches)
        {
            var syntaxTag = new SyntaxTag(match.Value);

            if (!syntaxTag.IsParseable)
            {
                continue;
            }

            if (syntaxTag.IsClosingTag)
            {
                content = content.Replace(syntaxTag.FullTagName, "</pre>");
            }
            else
            {
                content = content.Replace(syntaxTag.FullTagName, "<pre class=\"brush: " + syntaxTag.NameOnly + ";\">");
            }
        }

        return content;
    }
}
[/csharp]
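To see what that `\[(.*[a-z])\]` pattern actually captures, here is a small standalone check (the input string is made up). Note that `.` does not match newlines by default, which is why tags sitting on their own lines match cleanly:

```csharp
using System;
using System.Text.RegularExpressions;

// The same pattern parsePost uses, run against a made-up post body.
var body = "Intro text\n[csharp]\nint x = 1;\n[/csharp]\nOutro";
var matches = Regex.Matches(body, @"\[(.*[a-z])\]");

// Each bracketed tag becomes one match; group 1 holds the tag text.
foreach (Match m in matches)
    Console.WriteLine(m.Value + " -> capture: " + m.Groups[1].Value);
// [csharp] -> capture: csharp
// [/csharp] -> capture: /csharp
```

If both tags sat on one line, the greedy `.*` would swallow everything between them, so the pattern quietly relies on WordPress putting each tag on its own line.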
Pretty stock code; the only "interesting" part is the regular expression that grabs all of the SyntaxHighlighter tags. For those writing Regular Expressions, I find it incredibly useful to use a tool like Regex Hero to build them, since you can test input on the fly without having to constantly rebuild and rerun your code. Next on the "to code" list was the SyntaxTag object.
[csharp]
public class SyntaxTag
{
    public SyntaxTag(string value)
    {
        FullTagName = value;
    }

    public string NameOnly
    {
        get { return FullTagName.Replace("[/", "").Replace("[", "").Replace("]", ""); }
    }

    public bool IsClosingTag
    {
        get { return FullTagName.StartsWith("[/"); }
    }

    public string FullTagName { get; private set; }

    // Acceptable syntax tags (there are more, but this is all I used previously)
    private enum SYNTAXTAGS
    {
        csharp, xml, sql, php, c, bash, shell, cpp, js, java, ps, plain
    }

    public bool IsParseable
    {
        get
        {
            SYNTAXTAGS tag;
            return Enum.TryParse(NameOnly, out tag);
        }
    }
}
[/csharp]
Again, a pretty basic class. Based on the full tag, it provides a clean interface to the Post class (or others down the road) without mucking up other areas of code. One thing I did that many might find strange is using an enumeration to eliminate false positives. I am a huge fan of strongly typed code (thus why I shy away from languages that aren't), so it made perfect sense to utilize that approach here as well. As I add new tags for whatever reason, the logic is contained only here, so I won't be hunting around for where to update it. A less "clean" approach would be to put these in the web.config or in your SQL Database, though I find both of those more performance intensive and unnecessary in this case. Now that the business and data layers are good to go for the time being, let's go back to our MVC4 app. Inside my controller the code is still pretty simple for my Index:
[csharp]
public ActionResult Index()
{
    var model = new Models.HomeModel();

    using (var pFactory = new PostFactory())
    {
        model.Posts = pFactory.GetPosts(new DateTime(2001, 1, 1), new DateTime(2013, 4, 13));
    }

    ViewBag.Model = model;

    return View(model);
}
[/csharp]
    At the moment I don't have my WCF Service written yet so for the time being I am simply referencing the Windows Class Library mentioned above, thus why I am referencing the PostFactory class directly in the Controller. Then in my View:
[csharp]
@model bbxp.mvc.Models.HomeModel

@{
    ViewBag.Title = "Jarred Capellman";
}

@foreach (var post in Model.Posts)
{
    @Html.Partial("PartialPost", post)
}

<script type="text/javascript">
    SyntaxHighlighter.all()
</script>
[/csharp]
    As I am looping through each post I am calling out to my Partial View, PartialPost. And for my Partial View:
[csharp]
@model bbxp.lib.Objects.Post

<div class="post">
    <div class="Date">
        <h3>@Model.PostDate.ToLongDateString()</h3>
    </div>
    <div class="Content">
        <div class="Title">
            <h2>@Model.Title</h2>
        </div>
        <div class="Body">@(new MvcHtmlString(@Model.Body))</div>
    </div>
</div>
[/csharp]
The @(new MvcHtmlString(@Model.Body)) line is very important; otherwise your HTML tags will not be rendered as you would expect. When all is said and done, I went from this last night: [caption id="attachment_1982" align="aligncenter" width="300"]End Result of an Initial Conversion[/caption] To this tonight: [caption id="attachment_1992" align="aligncenter" width="300"]After applying regular expressions to the Post Content[/caption] Next up is creating the WordPress Sidebar History and extending the functionality of the "engine" to support single Post Views.
    In working on a new MVC4 app in my free time I had renamed the original assembly early on in development, did a clean solution, yet kept getting:
    Multiple types were found that match the controller named XYZ. This can happen if the route that services this request ('{controller}/{action}/{id}') does not specify namespaces to search for a controller that matches the request.
Turns out, since the project name changed, doing a clean solution had zero effect on the stale assembly. So the quick and easy solution: go to your bin folder and delete the original dll.
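Beyond deleting the stale dll, the error message itself hints at a longer-term guard: the MapRoute overload that takes a namespaces array, which tells MVC exactly where to resolve controllers from. A sketch against the stock default route (the namespace string below is a placeholder for your renamed project's root namespace):

```csharp
routes.MapRoute(
    name: "Default",
    url: "{controller}/{action}/{id}",
    defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional },
    namespaces: new[] { "MyRenamedApp.Controllers" } // placeholder: your actual namespace
);
```

With the namespace pinned, a leftover assembly with an identically named controller no longer makes the route ambiguous.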
Diving into MVC 4 this week and going through the default Kendo UI MVC 4 project type in Visual Studio 2012, I noticed quite a few assemblies I knew I wouldn't need for my current project, namely the DotNetOpenAuth assemblies. I removed the 6 DotNetOpenAuth.* assemblies: [caption id="attachment_1890" align="aligncenter" width="304"]MVC 4 Assemblies[/caption] In addition, you'll need to remove the Microsoft.Web.WebPages.OAuth reference as well. To my surprise, upon building and debugging the new project I received the following exception: [caption id="attachment_1891" align="aligncenter" width="550"]MVC 4 Exception - DotNetOpenAuth Not Found[/caption] I double checked my packages.config and web.config files for any reference, to no avail. As a last resort I deleted my bin and obj folders, rebuilt the solution and sure enough it started without any issues. Hopefully that helps someone out.