latest posts

Over the last week I have spent significant time fleshing out the UWP client for bbxp. In doing so I have run into several interesting problems to overcome. This blog post will cover the ones I have solved and those I uncovered that are much larger in scope than a simple UWP client.

Problem One - Displaying HTML

A simple approach to displaying the content returned from the WebAPI service would be to use a WebView control in XAML and call it a day.

The problem with this approach is that in my case I am using two CSS files, one for Bootstrap and one for my custom styles. Without the CSS being included I would have to either accept that the styling would be inconsistent or come up with a way to inject the CSS into the response item.

I chose the latter. Diving into this problem, one approach would be to pre-inject the CSS string into every item. Looking at the file sizes, even minified bootstrap.min.css is 119kb and my own minified style file is 3kb. That is an extra 122kb per response; returning 10 posts would mean over a megabyte of needless data. Another approach would be to return the CSS once along with the content strings, but this is problematic as it would require other elements (Content, Searching and Archives) to also implement this approach. After some playing around with these approaches I came up with what I feel is the best approach:

  1. Added a new Controller to the MVC application to return a single string with both CSS files
  2. For every request to display either Content or a Post, check if the CSS string is stored locally; otherwise go out to the MVC app, download it and store it. Finally, inject the CSS string into the content string and display the result in a WebView (sketched below)
A pretty simple and efficient approach, so let's dive into the code.
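Conceptually, the client side of step 2 boils down to something like the sketch below. This is illustrative only; the endpoint route, the cache file name and the class name are assumptions rather than the actual bbxp code:

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Windows.Storage;

public static class CssCache
{
    // Hypothetical MVC endpoint returning bootstrap.min.css + the site CSS as one string
    private const string CssEndpoint = "http://myblog.example.com/Styles/Combined";
    private const string CssFileName = "cached.css";

    public static async Task<string> GetCssAsync()
    {
        var folder = ApplicationData.Current.LocalFolder;

        // Use the locally stored copy if it has already been downloaded once
        var existing = await folder.TryGetItemAsync(CssFileName) as StorageFile;

        if (existing != null)
        {
            return await FileIO.ReadTextAsync(existing);
        }

        using (var client = new HttpClient())
        {
            var css = await client.GetStringAsync(CssEndpoint);

            var file = await folder.CreateFileAsync(CssFileName, CreationCollisionOption.ReplaceExisting);
            await FileIO.WriteTextAsync(file, css);

            return css;
        }
    }

    // Wrap the content/post HTML with the CSS so the WebView renders it consistently
    public static async Task<string> BuildHtmlAsync(string contentBody)
    {
        var css = await GetCssAsync();

        return "<html><head><style>" + css + "</style></head><body>" + contentBody + "</body></html>";
    }
}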

To start, I created a user control in the UWP app called HTMLRenderView to handle calling NavigateToString and to allow the XAML to bind the string via MVVM with a custom property called ContentBody.
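In rough terms, the control boils down to a dependency property that calls NavigateToString whenever it changes. A minimal sketch is below; the WebView element name and the property wiring are assumptions, not the exact code in the repo:

using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;

public sealed partial class HTMLRenderView : UserControl
{
    public HTMLRenderView()
    {
        InitializeComponent();
    }

    // Bindable string property; setting it pushes the HTML into the WebView
    public static readonly DependencyProperty ContentBodyProperty =
        DependencyProperty.Register("ContentBody", typeof(string), typeof(HTMLRenderView),
            new PropertyMetadata(null, OnContentBodyChanged));

    public string ContentBody
    {
        get { return (string)GetValue(ContentBodyProperty); }
        set { SetValue(ContentBodyProperty, value); }
    }

    private static void OnContentBodyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
    {
        var control = (HTMLRenderView)d;
        var html = e.NewValue as string;

        if (!string.IsNullOrEmpty(html))
        {
            // Assumes the control's XAML contains a <WebView x:Name="webView" />
            control.webView.NavigateToString(html);
        }
    }
}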

This way, whether I am loading several WebView controls in a list or just a single instance, my code-behind on the pages stays clean.

Problem Two - Matching Styles between Platforms

In doing this "port" to UWP I quickly realized that maintaining CSS and XAML styles is going to be problematic at best, especially when adding Xamarin Forms ports to iOS and Android down the road. I have run into this situation before at work on various projects, but in those scenarios the application styles were fairly static, as opposed to my blog where I update the styling fairly regularly. I am also thinking about others using this platform, or at least using it as a basis for their own.

My first inclination is to do something akin to TypeScript, where a single definition compiles down to the native syntax of each platform (CSS and XAML in my case). For the time being I will add this to my bucket list and investigate a solution down the road.

Conclusion

As of this writing there are only two features left in the UWP Client: Archives and External Link handling. There is additional styling and optimization work to be done, but overall once those two features are added I will begin on the Xamarin Forms port for iOS and Android clients.

All of the code thus far is committed on GitHub.

Introduction

In case you missed the other days of this deep dive:
Day 1 Deep Dive
Day 2 Deep Dive with MongoDB
Day 3 Deep Dive with MongoDB
Day 4 Deep Dive with MongoDB
Day 5 Deep Dive with MongoDB
Day 6 Deep Dive with Mongoose
Day 7 Deep Dive with Clustering
Day 8 Deep Dive with PM2
Day 9 Deep Dive with Restify
Day 10 Deep Dive with Redis
Day 11 Deep Dive with Redis and ASP.NET Core

As mentioned on Friday, I wanted to spend some time turning my newly acquired Node.js and Redis knowledge into something more real-world. Learning over the last two weeks when to use Node.js (and Redis) and when not to has been really interesting, especially coming from an ASP.NET background where I had traditionally used it to solve every problem. While ASP.NET is great and can solve virtually any problem you throw at it, as noted by my deep dives it isn't always the best solution; in particular there are huge performance issues as you scale an ASP.NET/Redis pairing versus Node.js/Redis. I want to get my bbxp blog platform, which powers this site, back into a service-oriented architecture as it was a couple of years ago with a WCF service, and possibly a micro-service architecture, so what better time than now to implement a caching layer using Node.js and Redis.

This blog post will detail some design decisions I made over the weekend and what I feel still needs some polishing. For those curious, the code currently checked into the GitHub repo is far from production ready; until it is, I won't be using it to power this site.

Initial Re-architecting

The first change I made to the solution was to add an ASP.NET Core WebAPI project, a .NET Standard business layer project and a .NET Standard data layer project for use with the WebAPI project. Having had the foresight earlier this year, when I redid the platform to utilize ASP.NET Core, to break everything out, the effort wasn't as huge as it could have been had all of the code simply been placed into the controllers of the MVC app.

One new aspect of this restructuring was playing around with the new Portable Class Library targeting .NET Standard. From what I have gathered, this is the future replacement for the crazy number of profiles we have been using for the last three years - Profile78 is a lot more confusing than .NET Standard 1.4 in my opinion. It took some time finding the most up-to-date table detailing which platforms are on which version, but for those also looking for a good reference, please bookmark the .NET Standard Library Roadmap page. For this project, as of this writing, UWP does not support higher than 1.4, so I targeted that version for the PCL project. From what I have read, 1.6 support is coming in the next major UWP platform NuGet package update.

Redis Implementation

After deep diving into Redis with ASP.NET Core in Day 11 and Node.js in Day 10, it became pretty clear Node.js was a much better choice for speed as the number of requests increased. Designing this platform to be truly scalable, and getting experience building a very scalable system with new technology I haven't messed with before, are definitely part of this revamp. Caching, as seasoned developers know, is a tricky slope. One could simply turn on output caching for an ASP.NET MVC or WebForms app - which wouldn't benefit the mobile, IoT or other clients of the platform. In a platform-agnostic world this approach can still be used, but I shy away from turning it on and calling it a day. I would argue that for a platform like bbxp in 2016, native apps and other services are hit more than the web app.

So what are some other options? For bbxp, the largest part of the request time server side is pulling the post data from the SQL Server database. I had previously added some dynamically generated normalized tables that are rebuilt when post content is created, updated or deleted, but even so this puts more stress on the database and requires scaling vertically, as these tables aren't distributed. This is where a caching mechanism like Redis can really help, especially in the approach I took.

A more traditional approach to implementing Redis with ASP.NET Core might be to simply have the WebAPI service check whether Redis has the data cached (i.e. the key exists), and if not push it into Redis and return the result. I didn't agree with this approach, as it needlessly hits the main WebAPI service even when the data is in the cache. A better approach in my mind is to implement a separate web service, in my case Node.js with restify, and have that communicate directly with Redis. Best case, you get the speed and scalability of Node.js and Redis without ever hitting the primary WebAPI service or SQL Server. Worst case, Node.js returns extremely quickly that the key was not found, and the client wrapper then makes a second request to the WebAPI service to not only query the data from SQL Server but also fire a call to Redis to add the data to the cache.

An important thing to note here is the way I wrapped the REST service calls: each consumer of the service does not actually know or care which data source the data came from. In nearly seven years of doing service-oriented architectures (SOA), I have found that the less business logic done client side the better; even something as simple as making a second call to a separate web service is too much. The largest part of that is consistency and maintainability of your code. In a multi-platform world you might have ASP.NET, Xamarin, UWP and IoT code bases to maintain with a small team, or worse, just a single person. Putting this code inside the PCL, as I have done, is the best approach I have found.
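To make that concrete, the shared call in the PCL looks conceptually like the sketch below. The base addresses, the WebAPI route and the DTO are placeholders rather than the real bbxp types; the point is that the consumer calls one method and never knows whether the data came from the Node.js/Redis cache or the WebAPI service:

using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class PostService
{
    // Hypothetical endpoints: the Node.js/restify cache layer and the ASP.NET Core WebAPI
    private const string CacheBaseUrl = "http://localhost:1338";
    private const string ApiBaseUrl = "http://localhost:5000";

    private readonly HttpClient _client = new HttpClient();

    public async Task<List<PostListingItem>> GetPostListingAsync()
    {
        // Best case: Node.js + Redis answers and the WebAPI/SQL Server is never touched
        var cached = await _client.GetStringAsync(CacheBaseUrl + "/node/Posts");

        if (!string.IsNullOrWhiteSpace(cached))
        {
            return JsonConvert.DeserializeObject<List<PostListingItem>>(cached);
        }

        // Worst case: cache miss - fall back to the WebAPI, which also repopulates Redis
        var fresh = await _client.GetStringAsync(ApiBaseUrl + "/api/Posts");

        return JsonConvert.DeserializeObject<List<PostListingItem>>(fresh);
    }
}

// Minimal stand-in for the real post listing DTO
public class PostListingItem
{
    public string Title { get; set; }
    public string URLSafename { get; set; }
}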

That being said, let's dive into some C# code. For my wrapper of Redis I chose a pretty straightforward approach: take a string value for the key and accept a type of T, which the helper function automatically converts into JSON to be stored in Redis:

public async void WriteJSON<T>(string key, T objectValue) {
    var value = JsonConvert.SerializeObject(objectValue, Formatting.None);

    value = JToken.Parse(value).ToString();

    await db.StringSetAsync(Uri.EscapeDataString(key), value, flags: CommandFlags.FireAndForget);
}

A key point here is the FireAndForget flag, so we aren't delaying the response back to the client while writing to Redis. A better approach for later this week might be to add in Azure Service Bus or a messaging system like RabbitMQ to handle the case where the key couldn't be added, for instance if the Redis server was down. In that scenario the system would still work with my approach, but scaling would be hampered, and depending on the number of users hitting the site and the server size itself, this could be disastrous.
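For context, the intended call site on the WebAPI side looks roughly like the following controller action; the repository and cache field names are hypothetical, and WriteJSON is the helper shown above:

[HttpGet]
public async Task<List<PostListingItem>> Get()
{
    // Pull the listing from SQL Server via the (hypothetical) data layer
    var posts = await _postRepository.GetPostListingAsync();

    // Fire-and-forget push into Redis so the Node.js layer can serve the next request
    _redisCache.WriteJSON("PostListing", posts);

    return posts;
}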

Node.js Refactoring

With the addition of several more routes being handled by Node.js than in my testing samples, I decided it was time to refactor the code to cut down on the duplicate redis client code and the handling of null values. At this point I am unsure if my node.js code is as polished as it could be, but it does in fact work and handles null checks properly.

My dbFactory.js, with the refactored code exposing a get helper that handles the null checking and returns the JSON data from Redis:

var redis = require("redis");
var settings = require('./config');

module.exports = function RedisFactory(key, response) {
    var client = redis.createClient(settings.REDIS_DATABASE_PORT, settings.REDIS_DATABASE_HOSTNAME);

    client.on("error", function(err) {
        console.log("Error " + err);
    });

    client.get(key, function(err, reply) {
        if (reply == null) {
            response.writeHead(200, { 'Content-Type': 'application/json' });
            response.end("");

            return response;
        }

        response.writeHead(200, { 'Content-Type': 'application/json' });
        response.end(reply);

        return response;
    });
};

With this refactoring, my actual route files are pretty simple at least at this point. Below is my posts-router.js with the cleaned up code utilizing my new RedisFactory object:

var Router = require('restify-router').Router;
var router = new Router();

var RedisFactoryClient = require("./dbFactory");

function getListing(request, response, next) {
    return RedisFactoryClient("PostListing", response);
};

function getSinglePost(request, response, next) {
    return RedisFactoryClient(request.params.urlArg, response);
};

router.get('/node/Posts', getListing);
router.get('/node/Posts/:urlArg', getSinglePost);

module.exports = router;

As one can see, the code is much simpler than the very redundant, bloated code it would have quickly become had I kept with the unrefactored approach from my testing code.

Next up...

Tomorrow night I hope to start implementing automatic cache invalidation and polishing the cache entry code in the business layer interfacing with Redis. With those changes I will detail the approach along with its pros and cons. For those curious, the UWP client will become a fully supported client along with iOS and Android clients via Xamarin Forms. Those building the source code will see a very early look at getting the home screen posts pulling down, with a UI that very closely resembles the MVC look and feel.
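My current thinking on the invalidation piece is to simply delete the affected keys from Redis whenever a post is created, updated or deleted, so the next request repopulates them. A rough sketch, living in the same wrapper class as WriteJSON and with the key names assumed:

public async Task InvalidatePostAsync(string postUrlKey)
{
    // Drop the individual post entry and the cached listing; both rebuild on the next cache miss
    await db.KeyDeleteAsync(Uri.EscapeDataString(postUrlKey));
    await db.KeyDeleteAsync("PostListing");
}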

All of the code for the platform is committed on GitHub. I hope to begin automated builds like I set up with Raptor and create releases as I add new features and continue making the platform more generic.

Introduction

In case you missed the other days:
Day 1 Deep Dive
Day 2 Deep Dive with MongoDB
Day 3 Deep Dive with MongoDB
Day 4 Deep Dive with MongoDB
Day 5 Deep Dive with MongoDB
Day 6 Deep Dive with Mongoose
Day 7 Deep Dive with Clustering
Day 8 Deep Dive with PM2
Day 9 Deep Dive with Restify
Day 10 Deep Dive with Redis

I was originally going to deep dive into AWS tonight, but the excitement over Redis last night had me eager to switch gears a bit and get Redis up and running in ASP.NET Core.

Prerequisites

I will assume you have downloaded and installed the Redis server. If you're on Windows you can download the Windows port; on Linux, install it through your distribution's package manager.

Setting it up in ASP.NET

Unsure of the "approved" client, I searched around on Redis's official site for clients to try out. Based on that list and NuGet, I chose to at least start with StackExchange.Redis. To get going, simply issue a NuGet command:

Install-Package StackExchange.Redis

Or search in NuGet for StackExchange.Redis. As of this writing I am using the latest version, 1.1.605.

Getting a Redis database up and running in ASP.NET was extremely painless, requiring just a few connection lines and then an async call to set the key/value:

private static ConnectionMultiplexer redis;
private IDatabase db;

[HttpGet]
public async Task<string> Get(int id) {
    if (redis == null) {
        redis = ConnectionMultiplexer.Connect("localhost");
    }

    if (db == null) {
        db = redis.GetDatabase();
    }

    await db.StringSetAsync(id.ToString(), 2, flags: CommandFlags.FireAndForget);

    return "OK";
}
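For completeness, reading a value back out is just as short. This is not part of the benchmark, just a sanity check, and it assumes the same static fields shown above have been initialized:

[HttpGet("{id}/value")]
public async Task<string> Read(int id)
{
    var value = await db.StringGetAsync(id.ToString());

    return value.HasValue ? value.ToString() : "(not found)";
}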

Performance

I was interested to see how Node.js and ASP.NET Core performed with the same test. The numbers speak for themselves:


The performance results were interesting to say the least, after having had nearly identical MongoDB results. Wondering if maybe there was a Kestrel difference, I re-ran the test:


Better, but not as dramatic an improvement as I would have assumed.

Next up...

Seeing as how the performance wasn't anywhere close to that of Node.js, I am wondering if utilizing the DI built into ASP.NET Core would alleviate the performance issues. My original intent was to spend Saturday adding Redis into my blogging platform for caching, but as of right now I will hold off until I can figure out the reason for the huge delta. All of the code thus far is committed on GitHub.
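My first thought on the DI route is to register a single ConnectionMultiplexer at startup rather than creating it lazily in the controller. A sketch of what that registration might look like (untested against the benchmark above):

public void ConfigureServices(IServiceCollection services)
{
    // Share one multiplexer across the app; controllers then take IConnectionMultiplexer in their constructor
    services.AddSingleton<IConnectionMultiplexer>(ConnectionMultiplexer.Connect("localhost"));

    services.AddMvc();
}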

Introduction

In case you missed the other days:
Day 1 Deep Dive
Day 2 Deep Dive with MongoDB
Day 3 Deep Dive with MongoDB
Day 4 Deep Dive with MongoDB
Day 5 Deep Dive with MongoDB
Day 6 Deep Dive with Mongoose
Day 7 Deep Dive with Clustering
Day 8 Deep Dive with PM2
Day 9 Deep Dive with Restify

After playing around with MongoDB I wanted to try out Redis, not only for a performance comparison to MongoDB, but also for practical experience for use at work and in other projects. So tonight I will get it up and running on Windows, make connections in node.js and do some perf testing.

Prerequisites

You will need to install Redis. If you're on Windows you can download the Windows port; on Linux, install it through your distribution's package manager.

redis

To get started we need to install the npm package for redis:

npm install redis -g

Once installed, getting the redis client connected to the server was pretty painless, especially after using MongoDB previously. For the sake of this testing app I simply replaced the MongoDB dbFactory.js file like so:

var redis = require("redis");
var settings = require('./config');

var client = redis.createClient(settings.REDIS_DATABASE_POST, settings.REDIS_DATABASE_HOSTNAME);

client.on("error", function (err) {
    console.log("Error " + err);
});

module.exports = client;

Since redis uses a port/hostname split value I added two additional config options and removed the older database connection property:

module.exports = {
    REDIS_DATABASE_HOSTNAME: 'localhost',
    REDIS_DATABASE_POST: 6379,
    HTTP_SERVER_PORT: 1338
};

In the actual route for the test it only required a few adjustments:

var Router = require('restify-router').Router;
var router = new Router();

var RedisClient = require('./dbFactory');

function respond(request, response, next) {
    var argId = request.params.id;

    RedisClient.set(argId.toString(), 2);

    return response.json({ message: true });
}

router.get('/api/Test', respond);

module.exports = router;

Performance

Interested in the performance differences, I was not surprised that a direct comparison was so dramatic. I should note these tests were done on my Razer Blade laptop, not my desktop.


When I add more functionality to provide full CRUD operations it will be interesting to really do an in-depth test comparing SQL Server, MongoDB and Redis.

Next up...

Tomorrow night I am planning on diving into Amazon Web Services (AWS) to get a good comparison to Rackspace and Azure, in particular for node.js development, as I imagine node development is more commonly done on AWS.

All of the code thus far is committed on GitHub.

Introduction

In case you missed the other days:
Day 1 Deep Dive
Day 2 Deep Dive with MongoDB
Day 3 Deep Dive with MongoDB
Day 4 Deep Dive with MongoDB
Day 5 Deep Dive with MongoDB
Day 6 Deep Dive with Mongoose
Day 7 Deep Dive with Clustering
Day 8 Deep Dive with PM2

On the flight back from California last night I queued up a few Node.js videos on YouTube, one of which was a presentation by Kim Trott of Netflix discussing Netflix's migration from a Java backend to node.js. I really enjoyed this presentation because it went over not only what worked great, but also what didn't - something that doesn't happen too often, especially from a larger company. Another highlight for me was reviewing some of the node modules Netflix uses to serve millions of hours of content daily, one of which was restify. Restify provides a clean interface for routes without the templating and rendering that Express offers, which fits my current testing much better.

Prerequisites

As mentioned previously, at this point I am going to assume MongoDB is up and running; if it is not, check my first day's post for details on how to get it up and running.

restify

To get started with restify you will need to install it via:

npm install restify -g

In diving into restify I found another module that goes hand in hand with it: restify-router. As detailed on the aforementioned link, this module offers the ability to have more separation between routes (similar to what those of us coming from ASP.NET are used to, with specific *Controller.cs files per grouping of routes).
To get going with restify-router simply install it via:

npm install restify-router -g

I should mention I have removed worker.js from the git folder for Day 9, as I am now utilizing pm2 as mentioned in yesterday's post.

Replacing the Express code with restify was pretty painless:

var restify = require('restify');
var settings = require('./config');
var testRouter = require('./test.router');

var server = restify.createServer();

testRouter.applyRoutes(server);

server.listen(settings.HTTP_SERVER_PORT);

For those following my deep dive, this should look very similar to the Express code, with the addition of the testRouter reference which I will go over below.
As mentioned above, I chose to utilize restify-router to split out my routes into their own files. My test.router:

var Router = require('restify-router').Router;
var router = new Router();

var Post = require('./dbFactory');

function respond(request, response, next) {
    var argId = request.params.id;

    var newPost = new Post({ id: argId, likes: 2 });

    newPost.save(function (err) {
        if (err) {
            return response.json({ message: err });
        }

        return response.json({ message: true });
    });
}

router.get('/api/Test', respond);

module.exports = router;

Similarly, there is not much difference from the older route definitions; the only big difference is the first line requiring the restify-router module.

Next up...

As mentioned last night, I am still investigating why on Windows only one worker process is getting hit versus utilizing all of them. I tested this on my i7 desktop and had the same results as on my 2014 Razer Blade. I hope to further deep dive into restify tomorrow and hopefully resolve the weird scaling issue I'm noticing.

All of the code thus far is committed on GitHub.

Introduction

In case you missed the other days:
Day 1 Deep Dive
Day 2 Deep Dive with MongoDB
Day 3 Deep Dive with MongoDB
Day 4 Deep Dive with MongoDB
Day 5 Deep Dive with MongoDB
Day 6 Deep Dive with Mongoose
Day 7 Deep Dive with Clustering

Keeping with the clustering discussion from yesterday, I started to look into load balancers and other options outside of the cluster module I started utilizing yesterday to solve the scaling issue that exists out of the box. In doing this research I came across a few solutions:

  1. http-proxy module
  2. nginx
  3. pm2
After some deep diving into other developers' comments and considering the overall plans for my deep dive, I chose pm2.

Prerequisites

As mentioned previously, at this point I am going to assume MongoDB is up and running; if it is not, check my first day's post for details on how to get it up and running.

pm2

To get started with pm2 you will need to install it via:

npm install pm2 -g

Since pm2 has a built-in clustering option, the code written last night no longer has a purpose, but thankfully I had abstracted the actual "worker" code out into its own worker.js file. That being said, all you have to do to have pm2 run your node.js code on all of your CPU cores is:

pm2 start worker.js -i 0

From there you should see pm2 kicking off a process for each of your available CPU cores. In my case, my 2014 Razer Blade laptop has 4 cores/8 threads, so it kicked off 8 processes. If you want to limit the number of processes, you can specify a different number instead of 0.

One of the neat things about pm2 is the ability to monitor the processes with a simple:

pm2 monit

Which on my machine produced:

A handy command when you're done is to issue a

pm2 stop all

command to stop all of the processes.

Next up...

Hopefully this was interesting for those following along on my deep dive into node.js. I am excited to keep deep diving into pm2 tomorrow. An issue I was running into on Windows 10 (I need to try it on Linux) is that only one process was being hit, at ~65% according to pm2. Whether this is a Windows-specific issue, a problem with pm2 or a problem with my code, I need to dive into further.

All of the code thus far is committed on GitHub.

Introduction

In case you missed the other days:
Day 1 Deep Dive
Day 2 Deep Dive with MongoDB
Day 3 Deep Dive with MongoDB
Day 4 Deep Dive with MongoDB
Day 5 Deep Dive with MongoDB
Day 6 Deep Dive with Mongoose

As mentioned yesterday, I wanted to abstract out the config settings and look into a way to better take advantage of the hardware at my disposal like I can with ASP.NET. Node.js, being single threaded, has a huge disadvantage when run on a multi-core server (as most are these days). Knowing there were workarounds, I was keen to figure out what they were and then compare the "out of the box" mode versus using multiple CPUs.

Prerequisites

As mentioned previously, at this point I am going to assume MongoDB is up and running; if it is not, check my first day's post for details on how to get it up and running.

Config

I read about a couple of different approaches to configs in the Node.js world. Some prefer a JSON configuration file, as I have gotten used to with ASP.NET Core, while others prefer to just use a JavaScript file with module.exports to expose config values. For the sake of this test app, at this point I went with the latter:

module.exports = {
    DATABASE_CONNECTIONSTRING: 'localhost/dayinnode',
    HTTP_SERVER_PORT: 1338
};

And then in my dbFactory.js:

var settings = require('./config');

mongoose.connect('mongodb://' + settings.DATABASE_CONNECTIONSTRING);

This way I can keep my code free of magic strings, while still having all of my configuration settings in one file.

Cluster

As it would turn out, Node.js has a built-in module called Cluster that, as the name implies, adds support for creating child node.js processes.
Getting it up and running was pretty painless: just a simple require on cluster and then a check to get the number of CPUs. I took it one step further and abstracted the actual "worker" code away into the worker.js file. The server.js file now looks like this:

var cluster = require("cluster");

cluster.setupMaster({
    exec: 'worker.js',
    silent: true
});

var numCPUs = require("os").cpus().length;

if (cluster.isMaster) {
    for (var i = 0; i < numCPUs; i++) {
        cluster.fork();
    }

    cluster.on("exit", function (worker, code, signal) {
        cluster.fork();
    });
}

In doing comparisons between the single-threaded approach and the new cluster approach there wasn't a distinguishable difference, which leads me to believe that, at least on my 2014 Razer Blade laptop, the bottleneck is the MongoDB database, not node.js.

Next up...

When I get back home I hope to test this new code on my i7 desktop to see if there is any discernible difference between the cluster approach and the single-threaded approach when using a MongoDB database. In addition, I want to ensure that MongoDB is configured properly with Mongoose, since the ASP.NET Core performance exceeded node.js's. All of the code thus far is committed on GitHub.

Introduction

In case you missed the other days:
Day 1 Deep Dive
Day 2 Deep Dive with MongoDB
Day 3 Deep Dive with MongoDB
Day 4 Deep Dive with MongoDB
Day 5 Deep Dive with MongoDB

Today's posting is an intro for myself into mongoose, a popular object modeler for node.js. This continues my deep dive into learning the node.js equivalents of what I am used to in the ASP.NET world; for this post in particular, how mongoose compares to Entity Framework.

Prerequisites

As mentioned previously, at this point I am going to assume MongoDB is up and running; if it is not, check my first day's post for details on how to get it up and running.

Utilizing Mongoose and cleaning up the MongoDB code

As I mentioned in yesterday's post, I wanted to clean up the code further, in particular all of the MongoDB code. Knowing Mongoose would help in this regard, I replaced the MongoDB code I had with, interestingly enough, less code plus an object model for my "Posts" object. The process was fairly similar to a code-first approach with Entity Framework, so it felt very comfortable.

Migrating all of my MongoDB code was extremely painless; even with the model creation code added, it left just the following:

var mongoose = require('mongoose');

mongoose.connect('mongodb://localhost/dayinnode');

var postSchema = new mongoose.Schema({
    id: Number,
    likes: Number
});

var Post = mongoose.model('Post', postSchema);

module.exports = Post;

What is neat is the actual usage of the Post object. In my routes.js file:

var express = require('express');
var Post = require('./dbFactory');

var router = express.Router();

router.get('/api/Test', function (request, response) {
    var argId = request.params.id;

    var newPost = new Post({ id: argId, likes: 2 });

    newPost.save(function (err) {
        if (err) {
            return response.json({ message: err });
        }

        return response.json({ message: true });
    });
});

module.exports = router;

If you have followed the posts so far you will see there are no database-specific connection objects or init calls cluttering up the business logic. In addition, Mongoose made saving objects extremely easy by providing a save function on the object.

Next up...

Overall, simply adding in Mongoose has cleaned up my code. Next up is the configuration component, to remove the hard-coded port number and database connection information. All of the code thus far is committed on GitHub.

Introduction

In case you missed the other days:
Day 1 Deep Dive
Day 2 Deep Dive with MongoDB
Day 3 Deep Dive with MongoDB
Day 4 Deep Dive with MongoDB

Tonight's posting is less about performance comparisons and more about getting my deep dive into a more "real-world" project. In doing these deep dives I'm learning when Node.js is a better tool to utilize versus ASP.NET in REST service scenarios. Going along this line, I really wanted to start shifting away from "hello world, throw all your code in one file" and start structuring the project like I would a C# project, breaking routes out from the main file and abstracting out the database functionality. So without further discussion, let's dive in.

Prerequisites

As mentioned previously, at this point I am going to assume MongoDB is up and running; if it is not, check my first day's post for details on how to get it up and running.

Benchmarking Tool

Although I did not mention this in the intro, I made some extensive changes to the WPF benchmarking tool to handle future testing in a more structured and dynamic way than before. Benchmarks are now abstracted away and dynamically populate a dropdown; for those curious, here is a screenshot:

Splitting out Routes

As mentioned in the intro, the main focus of tonight's deep dive is to start structuring the app like a more real-world app. The first step in my mind was to do away with the single server.js file I had been using this week, especially with the MongoDB-related code also mixed in there.

After some restructuring I now have two JavaScript files: server.js and routes.js.

My server.js is now just a few Express init calls:

var express = require('express');

var app = express();

app.use(require('./routes'));

app.listen(1338);

The only thing left in my mind is to pull the hard-coded port number (1338) into a config or constants file.

My routes.js:

var express = require('express');
var mongojs = require('mongojs');

var dburl = 'localhost/day2innode';
var collections = ['posts'];

var db = mongojs(dburl, collections);
var postData = db.collection('posts');

var router = express.Router();

router.get('/api/Test', function (request, response) {
    var id = request.params.id;

    var newData = { 'id': id, 'likes': 2 };

    postData.insert(newData, function (err, post) {
        if (err) {
            return response.json({ message: err });
        }

        return response.json({ message: true });
    });
});

module.exports = router;

For those keeping up with this deep dive, there is not too much new code here outside of defining the module.exports and express.Router. One thing I want to do in this file is utilize mongoose to clean up the MongoDB calls and pull the collection and database URL from a configuration file, similar to the server port number in the server.js file.

Next up...

As mentioned above I am planning to start deep diving into mongoose to clean up some of the MongoDB database code I wrote the other night. Lastly, all of the code thus far is committed on GitHub.

Introduction

In case you missed the other days:
Day 1 Deep Dive
Day 2 Deep Dive with MongoDB
Day 3 Deep Dive with MongoDB

Tonight's posting shifts a bit from my original plan of using JSON storage in favor of testing MongoDB with ASP.NET Core and comparing it to SQL Server/ASP.NET Core and Node.js/MongoDB.

Prerequisites

At this point I am going to assume MongoDB is up and running; if it is not, check my first day's post for details on how to get it up and running.
For this deep dive, the only new element I needed to add was the NuGet package reference for MongoDB.Driver. Because I am using ASP.NET Core instead of 4.6.2, I needed to download 2.3.0-rc1.

Regardless, the package also installs the MongoDB.Bson and MongoDB.Driver.Core NuGet packages.

WebAPI Code

After the NuGet package installation above, I removed the Entity Framework code and replaced it with MongoDB code that mimics what I had done in JavaScript a few nights ago (thus the huge similarities):

[HttpGet]
public async Task<string> Get(int id) {
    var client = new MongoClient("mongodb://localhost");

    var database = client.GetDatabase("day2innode");

    var collection = database.GetCollection<BsonDocument>("posts");

    var document = new BsonDocument {
        { "id", id },
        { "likes", 2 }
    };

    await collection.InsertOneAsync(document);

    return "OK";
}

Performance

After last night's SQL Server performance being pretty even with MongoDB at least in this simple test I was curious what the MongoDB driver for .NET would be like in terms of performance:

Not truly comfortable with the results, I re-ran the benchmarks several times and all were within 2% of the previous results. It is interesting, to say the least, that .NET, at least in this scenario, is faster across the board than the Node.js implementation.

Next up...

Tomorrow I hope to see if maybe there are some Node.js optimizations I can apply, admitting that before Tuesday night I had never messed with Node.js.
Lastly, all of the code thus far is committed on GitHub.

Introduction

In case you missed the other days:
Day 1 Deep Dive
Day 2 Deep Dive with MongoDB

As mentioned in last night's post tonight I will be adding in the SQL Server/WebAPI comparison to the MongoDB/Node.js work I did last night.

Prerequisites

If you're coming from a C# background you probably already have SQL Server 2016 Express, but in case you don't you can download it here.

SQL Server setup

Knowing SQL Server has gotten JSON storage options and is arguably the best relational database available, I wanted to try both a traditional relational test and a JSON/document storage test.

For the traditional test I executed the following SQL:

CREATE TABLE POSTS (
    ID int PRIMARY KEY NONCLUSTERED,
    Likes INT
)

WebAPI/EntityFramework Core

With Entity Framework Core coming out of RC back in June, I wanted to utilize it as opposed to the now "legacy" 6.x version most C# developers have gotten used to.

For this demo I chose to use a Code First Approach, defining my POCO as such:

public class Posts {
    [Key]
    public int ID { get; set; }

    public int Likes { get; set; }
}

The actual Entity Framework saving code is as follows:

[HttpGet]
public async Task<string> Get(int id) {
    var post = new Posts { Likes = id };

    _context.Posts.Add(post);

    await _context.SaveChangesAsync();

    return "OK";
}

In addition, I am utilizing the DI built into ASP.NET Core by adding a call inside startup.cs:

public void ConfigureServices(IServiceCollection services) {
    var connection = @"Server=localhost;Database=DayInNode;user id=sa;password=raptor;";

    services.AddDbContext(options => options.UseSqlServer(connection));

    services.AddMvc();
}
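The DbContext itself isn't shown above; a minimal sketch, with the class name assumed, would be something like this, and that class would be the generic argument to AddDbContext:

using Microsoft.EntityFrameworkCore;

// Sketch only - the real context class name in the repo may differ
public class DayInNodeContext : DbContext
{
    public DayInNodeContext(DbContextOptions<DayInNodeContext> options) : base(options) { }

    public DbSet<Posts> Posts { get; set; }
}

With that in place, the registration reads services.AddDbContext<DayInNodeContext>(options => options.UseSqlServer(connection)); and the controller receives the context via constructor injection as _context.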

Performance

I figured comparing a traditional relational approach to MongoDB would be dramatically different considering how much faster Node.js handled requests; however, even after running multiple tests it was pretty even:


Next up...

Tomorrow I hope to try this same test, but with the new JSON/Document Storage option that SQL Server 2016 added.
Lastly, all of the code thus far is committed on GitHub.

Introduction

As mentioned in yesterday's post and continuing my deep dive, tonight I will be deep diving into MongoDB specifically in conjunction with Node.js.

Prerequisites

You will need the following:
  1. MongoDB (3.2.9 was the latest available at the time of this writing)
  2. Run npm install mongojs in your Node.js project folder

MongoDB Deep-Dive

A few years back I read the great book Seven Databases in Seven Weeks, in addition to attending a presentation from David Makogon at a local .NET meetup a couple of years ago that introduced me to the world of NoSQL. Having spent my career (and pre-career) in "traditional" relational databases, I found both sources of information fascinating, in particular the Neo4j graph database. Anyone who has ever done relationship trees in SQL Server/Oracle/MySQL knows how much of a pain it becomes at more than 2 levels deep.

In MongoDB specifically, the idea is to think in collections, which are similar to tables in a relational database. To get started, we'll need to start MongoDB. This may not be the best approach, but it will work: simply go to C:\Program Files\MongoDB\Server\3.2\bin in a command prompt and type mongod. Upon executing that command, you will get output similar to the screenshot below:

Afterwards we will need to make our "collection" (in my case I called mine day2innode) with the mongo day2innode command; afterwards you should get output similar to the one below:

From there, if you run a show collections command it should return nothing, as the "table" is empty (effectively doing a SELECT * FROM sys.tables like in SQL Server, in my understanding). In the screenshot above I inserted data into the posts collection and then returned it to confirm it was written successfully.

So now we need to write some Node.js code to write into our new MongoDB database. I found the MongoJS module extremely helpful here. Taking the basis of what I had done last night, adding an argument to the route and then writing that to MongoDB seemed like a good first step. So let us dive into the code:

var dburl = 'localhost/day2innode';
var collections = ['posts'];

var express = require('express');
var mongojs = require('mongojs');

var app = express();

app.get('/api/Test', function (req, res) {
    var db = mongojs(dburl, collections);

    var id = req.param('id');

    var postData = db.collection('posts');

    var newData = { 'id': id, 'likes': 2 };

    postData.insert(newData, function(err, post) {
        if (err) {
            db.close();

            return res.json({ message: err });
        }

        db.close();

        return res.json({ message: true });
    });
});

app.listen(1338);

As you can see, there are a few new additions to the code base, all pretty much related to MongoJS.

Performance

Seeing as how WebAPI is traditionally paired with a relational database (mainly SQL Server), but knowing that SQL Server 2016 added JSON storage, I wanted to do a head-to-head comparison. For the sake of tonight, however, I will only post the Node.js + MongoDB performance metrics. Tomorrow: the head-to-head comparison and any tweaks I learn in between as far as MongoDB goes.

Again, all benchmarks were run on my i7-6700k (8x4GHz) with 16GB of DDR4 running at 3200MHz, all on Windows 10 Anniversary Edition. Since we're now dealing with storage, I should note everything is running on a Samsung MZHPV512HDGL M.2 PCIe drive.

Running an updated WPF test app, the results were pretty interesting:


Writing 5120 "rows" in 5120 separate calls in 19 seconds isn't bad especially in its unoptimized state. It will be interesting to compare it tomorrow night to WebAPI/SQL Server 2016.

Next up...

Next up, as mentioned above: adding in the SQL Server 2016/WebAPI comparisons and, if time permits tomorrow night after work, more deep diving into optimizing the MongoDB/Node.js code. Just in case, all of the code thus far is committed on GitHub.

Over the weekend I decided to finally start a deep dive into Node.js, specifically for a REST API. Coming from a background in C#-based SOAP and REST web services using ASP.NET, I figured this would be the target of my deep dive into node.js.

Prerequisites

To get started I downloaded node.js v6.5 Current for Windows (x64). In addition I installed the Node.js Tools 1.2 for Visual Studio 2015, available here. You will also as indicated on the download link need Visual Studio 2015 with Update 3 RTM or higher (I used Update 3 for those curious).

Node.js Deep-Dive

While watching the MVA on jump-starting Node.js, I started investigating the Express framework for Node.js, seeing it as something similar to what WebAPI was to MVC (before their recent unification).

Code wise for this first deep dive I wanted to keep it simple with a GET request that returns the current date time:

var express = require('express');

var app = express();

app.get('/api/Test', function(req, res) {
    res.json({ message: new Date().toLocaleString() });
});

app.listen(1338);

Pretty simple: include the express module and then define the route to listen on, like you would with attribute routing in WebAPI.

Benchmark Setup

To start off, all of the code is in a new GitHub repo, DaysInNode. Each day I will make a new folder to keep the tests separate for historical purposes.

Benchmarking-wise, I wrote a little WPF app utilizing HttpClient to create batches of GET requests from 10 up to 5260. Knowing this is not a 100% true test, I again wanted to keep it simple for Day 1.
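The harness itself is nothing fancy; conceptually it boils down to something like the sketch below (the real WPF app has UI and charting around it, and the URL is whatever endpoint is under test). Each batch size from 10 up to the maximum is just a different requests value:

using System;
using System.Diagnostics;
using System.Net.Http;
using System.Threading.Tasks;

public static class Benchmark
{
    // Fires off `requests` sequential GETs against `url` and returns the elapsed time
    public static async Task<TimeSpan> RunAsync(string url, int requests)
    {
        using (var client = new HttpClient())
        {
            var stopwatch = Stopwatch.StartNew();

            for (var i = 0; i < requests; i++)
            {
                await client.GetAsync(url);
            }

            stopwatch.Stop();

            return stopwatch.Elapsed;
        }
    }
}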

All benchmarks were run on my i7-6700k (8x4GHz) with 16GB of DDR4 running at 3200MHz, all on Windows 10 Anniversary Edition.

Benchmark Results

To my surprise, Node.js was 3x faster than WebAPI (.NET Core), and in some cases even more so, as shown in the graph below. In addition, memory usage for Node.js never exceeded 24.8mb, while IIS Express/WebAPI utilized 3.4mb and Kestrel/WebAPI utilized 135mb.



Next up...

Next up in testing is using a NoSQL database on both ends to test a more realistic scenario such as recording a "Like" or "Dislike".