Node.JS Module Patterns

1. The simplest module

2. Module Patterns
2.1. Define a global

Note: Don’t pollute the global space

2.2. Export an anonymous function

2.3. Export a named function

2.4. Export an anonymous object

2.5. Export a named object

2.6. Export an anonymous prototype

2.7. Export a named prototype


  • Named exports – allow you to export multiple ‘things’ from a single module
  • Anonymous exports – simpler client interface
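A quick sketch of the two export styles, collapsed into a single runnable file (the `calc` and `log` names are just illustrative stand-ins for what `require()` would hand back):

```javascript
'use strict';

// Named exports -- several "things" from one module.
// In a real calc.js you would write: exports.add = add; exports.subtract = subtract;
function add(a, b) { return a + b; }
function subtract(a, b) { return a - b; }
const calc = { add: add, subtract: subtract }; // what require('./calc') would return

// Anonymous export -- the whole module is a single function,
// which gives the client a simpler interface.
// In a real log.js you would write: module.exports = function (msg) { ... };
const log = function (msg) { return '[app] ' + msg; }; // what require('./log') would return

console.log(calc.add(2, 3)); // 5
console.log(log('started')); // [app] started
```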

CMS choice

I’m looking for a PHP-based CMS, easy to scale and flexible enough to be the starting point for the next generation of e-commerce websites for the hosting company I work for.

Why PHP? I started by putting scalability first, and I felt Node.js was the answer. However, things are not very generous on the CMS side. There are a few names, but they are either at an early stage of development or suffering from a lack of features: KeystoneJS, Pencilblue, Apostrophe, Ghost.

Note: Reaction Commerce is a really interesting project, as they are the only ones building something specifically for the e-commerce industry. They use Meteor, Node.js (an interesting combination), MongoDB, and CoffeeScript, and it ships as a Docker container.

I’m going to collect the strengths and weaknesses of a few of the PHP options available, with the mention that I’m going to write a separate post about Reaction Commerce.

We’ll discuss:

  • Expression Engine
  • Craft
  • ProcessWire

ExpressionEngine is built by EllisLab, a company that also created CodeIgniter, a popular PHP framework for building robust web applications. ExpressionEngine 2.x is built on top of CodeIgniter.

Craft is built by Pixel and Tonic, a company that, interestingly, got its start creating third-party add-ons for ExpressionEngine. Their add-ons – Playa and Matrix – are well-built, renowned plugins within the ExpressionEngine community.

ProcessWire – it’s basically PHP with a really extensive, jQuery-like API, so literally anything is possible.

Data modelling

A model is simply a type of content your site stores. You might have a “blog post”, “product”, or “staff member” model. ExpressionEngine calls these model types channels, while Craft calls them sections.

The flexibility of the model comes from custom fields.


Craft strengths:

  • Responsive control panel
  • Live preview
  • Entry draft/version functionality
  • Has several pricing options to fit your needs
  • Custom entry types (if you have several “types” of blog posts that differ in content/layout)


ExpressionEngine strengths:

  • More add-ons for things like e-commerce
  • Been around longer
  • Well known within large companies

Craft uses Twig as its template engine.

Additional links:


Install the latest Node.js on Ubuntu

Note for myself: if you want to “dockerize” the latest version of Node.js, don’t forget the commands below:


for v5.0
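For Node.js v5 on Ubuntu, the usual route is the NodeSource setup script, roughly as follows (drop `sudo` inside a Dockerfile, where you are already root):

```shell
curl -sL https://deb.nodesource.com/setup_5.x | sudo -E bash -
sudo apt-get install -y nodejs
```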

NodeJS Non-blocking architecture

JavaScript is a dynamic, object-oriented, functional scripting language. One of the features that made it win over Java applets in the browser scripting war decades ago was its lightness and non-blocking event loop.

Blocking means that while one line of code is executing, the rest is locked, waiting for it to finish. Non-blocking, on the other hand, gives each line of code a shot and then, through callbacks, comes back when an event happens. Blocking languages (Java, Ruby, Python, PHP, …) overcome concurrency limits using multiple threads of execution, while JavaScript handles concurrency with a non-blocking event loop in a single thread.


Some companies, like PayPal, moved their backends from Java to Node.js and reported increased performance, lower average response times, and development speed gains. Something similar happened at Groupon, which moved away from its Java/Rails monoliths.

Node.JS Tools

1. NPM – The Node Package Manager

When discussing Node.js, one thing that definitely should not be omitted is built-in support for package management using the NPM tool that comes by default with every Node.js installation. The idea of NPM modules is a set of publicly available, reusable components, available through easy installation via an online repository, with version and dependency management.

Some of the most popular NPM modules today are:

  • express – Express.js, a Sinatra-inspired web development framework for Node.js, and the de-facto standard for the majority of Node.js applications out there today.
  • connect – Connect is an extensible HTTP server framework for Node.js, providing a collection of high performance “plugins” known as middleware; serves as a base foundation for Express.
  • socket.io and sockjs – server-side components of the two most common WebSocket solutions out there today.
  • Jade – One of the popular templating engines, inspired by HAML, a default in Express.js.
  • mongo and mongojs – MongoDB wrappers to provide the API for MongoDB object databases in Node.js.

NPM usage:

The best way to manage locally installed npm packages is to create a package.json file.

A package.json file allows you to:

  • document what packages your project depends on
  • specify the versions of each package your project can use, via semantic versioning rules
  • make your build reproducible, which makes it much easier to share with other developers

To specify the packages your project depends on, you need to list the packages you’d like to use in your package.json file. There are 2 types of packages you can list:

  • "dependencies": these packages are required by your application in production
  • "devDependencies": these packages are only needed for development and testing
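A package.json matching the description below might look like this (`my_dep` and `my_test_framework` are placeholder package names; the caret ranges express the “matches major version” rule):

```json
{
  "name": "my_package",
  "version": "1.0.0",
  "dependencies": {
    "my_dep": "^1.0.0"
  },
  "devDependencies": {
    "my_test_framework": "^3.1.0"
  }
}
```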

The package.json above specifies that the app uses any version of the package my_dep that matches major version 1 in production, and requires any version of my_test_framework that matches major version 3, but only for development.

2. Nodemon

This is a tool to manage Node processes during development. With nodemon you can start a Node process and keep it running. It uses fsevents to hook into filesystem changes and restarts the Node process on each file change.

You can install it using npm using the following command. I like to install it globally so I can use it for all projects, but you can remove the -g to install it locally instead.
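The install command is the usual npm one:

```shell
npm install -g nodemon
```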

Now instead of using node server.js  to run your application, you can use nodemon server.js. It will watch for any changes in your application and automatically restart your server for you.

3. Node inspector

Node Inspector is a debugger interface for Node.js applications that uses the Blink Developer Tools. The really cool thing is that it works almost exactly as the Chrome Developer Tools.

You should have a flavour of the Chrome browser (Chrome, Chromium, etc.) installed.

Once it is installed, you can run it using the following command. This will start the debugger and open your browser.
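Assuming a global install, the commands would be along these lines (`server.js` is a placeholder for your entry script; `node-debug` is the node-inspector helper that starts the script under the debugger and opens the browser):

```shell
npm install -g node-inspector

# start your script under the debugger and open the browser:
node-debug server.js
```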

Can you combine nodemon with Node Inspector? Yes: start your server with nodemon --debug server.js, then run node-inspector in a separate terminal window, unless you push nodemon to the background.

4. Helmet

Helmet can help protect your app from some well-known web vulnerabilities by setting HTTP headers appropriately.

Helmet is actually just a collection of nine smaller middleware functions that set security-related HTTP headers:

  • csp sets the Content-Security-Policy header to help prevent cross-site scripting attacks and other cross-site injections.
  • hidePoweredBy removes the X-Powered-By header.
  • hpkp adds Public Key Pinning headers to prevent man-in-the-middle attacks with forged certificates.
  • hsts sets Strict-Transport-Security header that enforces secure (HTTP over SSL/TLS) connections to the server.
  • ieNoOpen sets X-Download-Options for IE8+.
  • noCache sets Cache-Control and Pragma headers to disable client-side caching.
  • noSniff sets X-Content-Type-Options to prevent browsers from MIME-sniffing a response away from the declared content-type.
  • frameguard sets the X-Frame-Options header to provide clickjacking protection.
  • xssFilter sets X-XSS-Protection to enable the Cross-site scripting (XSS) filter in most recent web browsers.
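A minimal sketch of wiring Helmet into an Express app (assumes `express` and `helmet` are installed via npm; note that `helmet()` enables a sensible default subset of the middleware above, while non-default pieces like noCache have to be switched on explicitly):

```javascript
'use strict';
const express = require('express');
const helmet = require('helmet');

const app = express();
app.use(helmet());           // the default bundle of security-header middleware
app.use(helmet.noCache());   // opt in to the non-default pieces explicitly

app.get('/', function (req, res) {
  res.send('Hello, hardened world');
});

app.listen(3000);
```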

5. Express-limiter

Express-limiter implements rate limiting, which helps prevent brute-force attacks against authentication endpoints.
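A sketch of express-limiter usage (it needs a Redis client; the path, limits, and lookup key here are illustrative):

```javascript
'use strict';
const express = require('express');
const app = express();
const client = require('redis').createClient();
const limiter = require('express-limiter')(app, client);

// Allow at most 10 POSTs to /login per hour from a single IP address:
limiter({
  path: '/login',
  method: 'post',
  lookup: ['connection.remoteAddress'],
  total: 10,
  expire: 1000 * 60 * 60
});

app.post('/login', function (req, res) {
  res.send('login attempt');
});
```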

6. Cluster Service

Turn your single process code into a fault-resilient, multi-process service with built-in REST & CLI support. Restart or hot upgrade your web servers with zero downtime or impact to clients.



As Wikipedia states: “Node.js is an open-source, cross-platform runtime environment for developing server-side web applications. Node.js applications are written in JavaScript and can be run within the Node.js runtime on OS X, Microsoft Windows, Linux, FreeBSD, NonStop,[3] IBM AIX, IBM System z and IBM i. Its work is hosted and supported by the Node.js Foundation,[4] a collaborative project at Linux Foundation.[5]”

Node.js – in simple words – is server-side JavaScript.

The platform and core framework were designed around an event-driven, non-blocking I/O model and constructing a trivial server is as simple as the following script:

The server can be launched by executing the script via Node:
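Assuming the script is saved as server.js (an assumed filename), launching it is just:

```shell
node server.js
```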

Contributing to Node’s phenomenal growth was an excellent package management system built on lessons learned from other communities. Creating, publishing, and installing a Node module or collection of related modules as a package is simple and fast.

Packages contain their own local copy of installed modules, easing deployment since there is no requirement to install to any common file system locations (and resolve potential version conflicts).

All that is required for a package is a package.json file that contains a bit of metadata about a package and its dependencies. A minimal package.json file only requires a name and version.
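A minimal example (the name is of course arbitrary):

```json
{
  "name": "my-package",
  "version": "0.1.0"
}
```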

Node.js benefits:

  • Lightweight HTTP server processes — the Node platform is based on Google’s well-regarded open source, high performance V8 engine, which compiles JavaScript to native machine code on supported systems; this machine code undergoes dynamic optimization during runtime. The V8 engine is highly tuned for fast startup time, small initial memory footprint, and strong peak performance.

  • Highly scalable — the Node platform was designed from the outset for end-to-end asynchronous I/O for high scalability.

  • Lightweight for developers — there is minimal ceremony involved in creating and consuming packages, which encourages a high degree of modularization and lightweight, narrowly-focused packages independently versioned and published.

Where Can Node.js Be Used?

Server-side web applications / REST APIs


Pros:

    • If your application doesn’t have any CPU-intensive computation, you can build it in JavaScript top-to-bottom, even down to the database level, if you use a JSON storage object DB like MongoDB.
    • Crawlers receive a fully-rendered HTML response, which is far more SEO-friendly than, say, a Single Page Application or a websockets app run on top of Node.js


Cons:

    • Any CPU-intensive computation will block Node.js’s responsiveness, so a threaded platform is a better approach. Alternatively, you could try scaling out the computation.
    • Using Node.js with a relational database is still quite a pain. Do yourself a favour and pick up any other environment like Rails, Django, or ASP.Net MVC if you’re trying to perform relational operations.

Where Shouldn’t Node.js Be Used?

  • Server-side web applications with a relational database behind – Relational DB tools for Node.js are still in their early stages; they’re rather immature and not as pleasant to work with. Still, if you’re really inclined to remain JS all the way (and ready to pull out some of your hair), keep an eye on Sequelize and Node ORM2.
  • Heavy server-side computation/processing – When it comes to heavy computation, Node.js is not the best platform around. No, you definitely don’t want to build a Fibonacci computation server in Node.js. In general, any CPU-intensive operation annuls all the throughput benefits Node offers with its event-driven, non-blocking I/O model, because incoming requests will be blocked while the thread is occupied with your number-crunching. Node.js is single-threaded and uses only a single CPU core. When it comes to adding concurrency on a multi-core server, there is some work being done by the Node core team in the form of a cluster module. You can also run several Node.js server instances pretty easily behind a reverse proxy via nginx. Even with clustering, you should still offload all heavy computation to background processes written in a more appropriate environment, and have them communicate via a message queue server like RabbitMQ.
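A sketch of the core cluster module mentioned above: the master forks one worker per CPU core, and each worker runs its own copy of the HTTP server on a shared listening socket (port 8000 is illustrative).

```javascript
'use strict';
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // one worker per core
  os.cpus().forEach(function () { cluster.fork(); });

  // fork a replacement if a worker dies
  cluster.on('exit', function (worker) {
    console.log('worker ' + worker.process.pid + ' died, forking a replacement');
    cluster.fork();
  });
} else {
  // every worker shares the same listening socket
  http.createServer(function (req, res) {
    res.end('handled by pid ' + process.pid + '\n');
  }).listen(8000);
}
```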


Node.js was never created to solve the compute scaling problem. It was created to solve the I/O scaling problem, which it does really well.

The Revealing Module Pattern

This pattern is the same concept as the module pattern in that it focuses on public and private methods. The only difference is that the revealing module pattern was engineered to ensure that all methods and variables are kept private until they are explicitly exposed, usually through an object literal returned by the closure in which they are defined.


Pros:

  • Cleaner approach for developers
  • Supports private data
  • Less clutter in the global namespace
  • Localization of functions and variables through closures
  • The syntax of our scripts is even more consistent
  • Explicitly defined public methods and variables, which leads to increased readability


Cons:

  • The same as the Module Pattern
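A small, self-contained example of the pattern: private state lives inside an immediately-invoked function, and the returned object literal “reveals” only the public members.

```javascript
'use strict';

const counter = (function () {
  // private state: only reachable through the closure below
  let count = 0;

  function increment() {
    count += 1;
    return count;
  }

  function reset() {
    count = 0;
  }

  // "reveal" only the members meant to be public
  return {
    increment: increment,
    reset: reset
  };
})();

console.log(counter.increment()); // 1
console.log(counter.increment()); // 2
counter.reset();
console.log(counter.increment()); // 1
```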


The Module Pattern

Moving on to the module pattern itself: it is used to mimic classes from conventional software engineering and focuses on public and private access to methods and variables. The aim is to reduce the number of globally scoped variables, thus decreasing the chances of collision with other code throughout an application.


Pros:

  • Cleaner approach for developers
  • Supports private data
  • Less clutter in the global namespace
  • Localization of functions and variables through closures


Cons:

  • Private methods are inaccessible.
  • Private methods and functions lose extensibility since they are inaccessible.

Another example of the module pattern exposes the module a little differently and makes use of a shared private cache. This method encourages more of an object-creation approach, where we can optimize performance by being efficient with shared storage.
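A sketch of that variant: `create()` hands out lightweight objects that all read and write one closed-over cache (the namespacing scheme here is illustrative).

```javascript
'use strict';

const store = (function () {
  // private cache shared by every object created below
  const cache = {};

  function create(namespace) {
    return {
      set: function (key, value) { cache[namespace + ':' + key] = value; },
      get: function (key) { return cache[namespace + ':' + key]; }
    };
  }

  return { create: create };
})();

const a = store.create('users');
const b = store.create('users');
a.set('42', 'Ada');
console.log(b.get('42')); // both objects share the private cache
```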


Using Windows Authentication with Web Deploy and WMSVC

By default in Windows Server 2008 when you are using the Web Management Service (WMSVC) and Web Deploy (also known as MSDeploy) it will use Basic authentication to perform your deployments. If you want to enable Windows Authentication you will need to set a registry key so that the Web Management Service also supports using NTLM. To do this, update the registry on the server by adding a DWORD key named “WindowsAuthenticationEnabled” under HKEY_LOCAL_MACHINE\Software\Microsoft\WebManagement\Server, and set it to 1. If the Web Management Service is already started, the setting will take effect after the service is restarted.
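The registry change described above can also be made from an elevated command prompt:

```bat
reg add HKLM\Software\Microsoft\WebManagement\Server /v WindowsAuthenticationEnabled /t REG_DWORD /d 1
```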

net stop wmsvc
net start wmsvc

To install Web Deploy please use

Katana & Owin – Part I

Katana is a flexible set of components for building and hosting OWIN-based web applications.
OWIN defines a standard interface between .NET web servers and web applications. The goal of the OWIN interface is to decouple server and application, encourage the development of simple modules for .NET web development, and, by being an open standard, stimulate the open source ecosystem of .NET web development tools.

Let’s try to create the ‘classic’ Hello World! application for Katana.
Start Visual Studio 2012 then create a new Console Application.
Ctrl + Q -> Package Manager Console


for diagnostics
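The Package Manager Console commands were most likely along these lines (assumed package names: Microsoft.Owin.SelfHost pulls in the hosting bits plus the HttpListener server, and Microsoft.Owin.Diagnostics is the one needed for diagnostics, i.e. the welcome page):

```shell
Install-Package Microsoft.Owin.SelfHost
Install-Package Microsoft.Owin.Diagnostics
```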

The source code is:
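A sketch of what the missing listing likely contained, assuming the Microsoft.Owin.SelfHost package (names like KatanaHello and the port are illustrative):

```csharp
using System;
using Microsoft.Owin.Hosting;
using Owin;

namespace KatanaHello
{
    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            // Terminal middleware: answer every request with plain text.
            app.Run(context =>
            {
                context.Response.ContentType = "text/plain";
                return context.Response.WriteAsync("Hello world !");
            });
        }
    }

    class Program
    {
        static void Main(string[] args)
        {
            // Self-host on port 8080 until Enter is pressed.
            using (WebApp.Start<Startup>("http://localhost:8080"))
            {
                Console.WriteLine("Listening on http://localhost:8080");
                Console.ReadLine();
            }
        }
    }
}
```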

If you want to have a nice welcome screen instead of “Hello world !” just add below line of code and ignore the rest of the code from Configuration method.
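With the diagnostics package referenced, that line is the welcome-page extension:

```csharp
app.UseWelcomePage();
```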

Under the hood

Owin describes one main component which is the following interface:
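That component is the OWIN application delegate, a plain function type rather than a classic interface:

```csharp
Func<IDictionary<string, object>, Task>
```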

This is a function that accepts a simple dictionary of objects, keyed by string identifiers. The function itself returns a Task. The objects in the dictionary vary depending on what each key refers to.

More often, you will see it referenced like this:
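Namely, aliased at the top of a file:

```csharp
using AppFunc = Func<IDictionary<string, object>, Task>;
```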

An actual implementation might look like this:

This is essentially how the “environment” or information about the HTTP context is passed around. Looking at the environment argument of this method, you could interrogate it as follows:
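An implementation that interrogates the environment might look like this (the owin.RequestPath key is defined by the OWIN specification):

```csharp
public Task Invoke(IDictionary<string, object> environment)
{
    // owin.RequestPath holds the request path relative to the host
    var path = (string)environment["owin.RequestPath"];
    Console.WriteLine("Your Path is [{0}]", path);
    return Task.FromResult(0);
}
```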

If an HTTP request were made to http://localhost:8080/Content/Main.css, the output would be:

Your Path is [/Content/Main.css]

In addition, while not part of the spec, the IAppBuilder interface is also core to the functioning of an Owin module (or ‘middleware’ in Owin speak):
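The interface itself is small (as declared in the Owin package):

```csharp
public interface IAppBuilder
{
    IDictionary<string, object> Properties { get; }

    object Build(Type returnType);
    IAppBuilder New();
    IAppBuilder Use(object middleware, params object[] args);
}
```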

The IAppBuilder interface acts as the ‘glue’ or host that brings any registered Owin-compatible libraries/modules together. With this simple mechanism, you can write isolated components that each deal with a specific piece of HTTP request functionality, then chain them together to build up the capabilities of your HTTP server. You can literally chain Owin components together to form a pipeline of only the features you need.
Basically, you can build or use a custom host and then insert whatever custom modules you need into the HTTP request processing pipeline. Owin provides the specification for writing those modules so that they are easy to insert into the chain.

Sample Katana component

Owin components are also known as middleware.

Wrap the component in an extension method and invoke it in a way that resembles the Welcome page invocation.
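A sketch, with MyMiddleware standing in as a hypothetical name for whatever component you wrote:

```csharp
public static class MyMiddlewareExtensions
{
    // Extension method so client code reads like the built-in helpers.
    public static IAppBuilder UseMyMiddleware(this IAppBuilder app)
    {
        return app.Use<MyMiddleware>();
    }
}

// Client code then mirrors app.UseWelcomePage():
//   app.UseMyMiddleware();
```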

See the OWIN specification for the full details.

Chaining components