docka - a CLI for common Docker commands

Using Docker as a development tool is great, but if your developer life is spent typing docker-compose and docker exec -it <container name> /bin/bash all day, you start looking for help. You could add and maintain alias entries for your commands in your bash profile, but why bother when you can just type npm i docka -g and get all the benefits?

I created docka, a simple CLI (command line interface) to make those repetitive commands easier to type and remember. A full list of commands can be found here, which includes ssh, stop/start/restart, logs, prune, cleanup and more.

Let me know in the comments if it makes your life easier.

Setting up and formatting Papertrail log levels

Papertrail is a great tool for collecting logs on a budget but the alerts seem lacking as it's either not possible or not clear how to filter on Node.js log levels.

Below is a full walkthrough of setting up Papertrail in your Node.js app, how to log to Papertrail only in production, and how to set up custom log formatting.

Firstly, install the logging packages:

npm i winston --save
npm i winston-papertrail --save

Then set up the logger (in your app.js, server.js etc):

const winston = require('winston');
require('winston-papertrail').Papertrail;

const winstonPapertrail = new winston.transports.Papertrail({
    host: '',
    port: 12345,
    hostname: 'myapp',
    level: 'debug',
    logFormat: (level, message) => {
        return `[${level.toUpperCase()}] ${message}`;
    }
});

const consoleLogger = new winston.transports.Console({
    level: 'debug',
    colorize: true
});

winstonPapertrail.on('error', (err) => {
    console.log('Papertrail error: ', err);
});

// only log to papertrail in production
const transportArray = process.env.NODE_ENV === 'production' ? [winstonPapertrail, consoleLogger] : [consoleLogger];

// setup transports
const logger = new winston.Logger({
    transports: transportArray
});

This pipes all logging to Papertrail when running with a NODE_ENV of production; at all other times everything gets logged to the console for debugging/developing.

Where the magic happens is here (above):

logFormat: (level, message) => {
    return `[${level.toUpperCase()}] ${message}`;
}

How you want your logs displayed is up to you, but I like mine formatted like this:

[ERROR] There was an error
[INFO] There was an info
[WARN] There was a warn
[DEBUG] There was a debug

You can control these using our new logger instance. Eg:

logger.error('This is an error');'This is an info');
logger.warn('This is a warn');
logger.debug('This is a debug');

You can then jump into Papertrail and add the alerts on whatever level you wish (Eg: Error):


All done! Let me know what you think in the comments!

Best practices for running Node.js in production

Companies like PayPal, Twitter and Walmart have shown that Node.js is very much production ready. Stats show that it's a perfect platform for APIs, and there are many developers with knowledge of both the front and back end, but what do you need to know before running in production?

1. Write clean and secure code

This sounds silly and vague but it’s super important. Here are a few tips to ensuring your code is as secure as possible:

  1. Write code with as few external modules as possible. If you choose to use an NPM module, ensure it’s a reputable one, and preferably one whose code base you have looked through.
  2. If you are using a framework, ensure you are using security modules and/or best practices. Eg: Express uses the Helmet module etc.
  3. Ensure that if you are using a framework that you are using an up to date version.
  4. Lock the versions of your NPM module dependencies to ensure not only that your application stays stable but also that a module doesn’t get updated with a security issue or bug.
  5. Ensure your dependencies are checked against an exploit checker like Snyk.
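To illustrate point 4, versions can be locked by dropping the caret/tilde range prefixes in package.json so npm installs exactly what you tested against (the module names and versions here are only examples):

```json
{
  "dependencies": {
    "express": "4.16.3",
    "helmet": "3.12.1"
  }
}
```

With ranges like "^4.16.3" removed, a fresh npm install cannot silently pull in a newer release.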

2. Run your application using a process manager

Running your application using PM2 or Forever will ensure it’s up and accessible at all times. The idea of a process manager is to ensure your application starts with your server and restarts when errors occur. Some process managers like PM2 also support monitoring services like Keymetrics, which track exceptions and monitor for high memory and CPU usage.

3. Catch any exceptions

It’s generally not good to swallow uncaught exceptions but I’m a believer that if you monitor and act on them they can be really handy for application stability. Catching them can be done like:

process.on('uncaughtException', (err) => {
    console.error('Uncaught exception: ', err);
});

4. Log, log and log some more

You should create good, readable logs which are output to a searchable log store. You should monitor these logs, constantly improve them and ensure your application is fixed for any recurring issues and exceptions (see point 3).

Is there anything I missed? Let me know.

Finding an Unhandled Promise rejection

If you have worked with Node.js and Promises you will know that tracking down an Unhandled Promise rejection can be quite difficult as the stack doesn't state where the unhandled Promise is located. When you have a small app with very little code, it's still relatively easy to track down but when your app is huge... it's damn near impossible.

Here is a little snippet of code to help you track down these pesky little buggers!

process.on('unhandledRejection', function(err, promise){
    console.error('[ERROR] Unhandled rejection (promise: ', promise, ', reason: ', err, ').');
});

Thank me later.

parsa - Parse date string (with format) into a Date() Object & more

Introducing parsa, a library which can be used in the browser and in Nodejs to parse or validate the following:

  • String formatted (with format) dates into a Date Object
  • Validates IP (true/false)
  • Parses a URL query string into an Object
  • Parses a URL into an Object with Host, Path, Hash, Protocol etc
  • Validates Email address (true/false)
  • Extracts numbers and decimals from string
  • Extracts words from string
  • Checks for a secure password (8 Characters, uppercase, lowercase, number & special characters)

The most important feature of parsa is the ability to parse a date in a given format into a Javascript Date() Object. This is particularly useful when you need to extract parts of a date which is not in a format that can be parsed using Date.parse. Sometimes you might extract parts of the date and reformat it, or possibly use that date to insert into a database date-formatted field. There are other libraries which can achieve this (Moment.js) but they are much larger in size and not suited to the browser if only performing one task.


Browser:

<script type="text/javascript" src="dist/parsa.min.js" charset="utf-8"></script>
<script type="text/javascript" charset="utf-8">
    console.log('parseDate: 20121125 = ', parsa.parseDate('20121125', 'YYYYMMDD'));
</script>

Node.js:

const parsa = require('parsa');
parsa.parseDate('20121125', 'YYYYMMDD');


npm test


gulp deploy


The parseDate function takes a date string and format string parameters and returns a Javascript Date() Object.

parsa.parseDate('20121125', 'YYYYMMDD')


Sun Nov 25 2012 01:00:00 GMT+0100 (CET)

Supported formats
  • Do, MMMM, YYYY


The validateIp function takes an IP address string and returns a boolean value whether it is valid or invalid.





The parseQuery function takes a URL and returns an Object of the Query string parameters.



{
    "category": "4",
    "product_id": "2140",
    "query": "lcd+tv"
}


The parseUrl function takes a URL and returns an Object of the URL section.



{
    "url": "",
    "protocol": "https",
    "host": "",
    "port": ":80",
    "path": "/dir/1/2/",
    "file": "search.html",
    "query": "?arg=0-a&arg1=1-b&arg3-c",
    "hash": "#hash"
}


The validateEmail function takes an email address string and returns a boolean value whether it is valid or invalid.
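The example output is missing here; for illustration, a deliberately simple check in the same spirit (parsa's real validation rules may differ):

```javascript
// Hypothetical stand-in for parsa.validateEmail
function validateEmail(email) {
    // one '@', no whitespace, and a dot in the domain part
    return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);
}

console.log(validateEmail('user@example.com')); // true
console.log(validateEmail('not-an-email'));     // false
```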





The extractNum function takes a string and returns an array of numbers/decimals found in that string.

parsa.extractNum('This is a10 string with3.14decimals6 and numbers.')
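The expected output is missing above; a short stand-in (again, not parsa's actual code) makes the behaviour concrete:

```javascript
// Hypothetical stand-in for parsa.extractNum
function extractNum(str) {
    // match integers and decimals, then convert to numbers
    return (str.match(/\d+(?:\.\d+)?/g) || []).map(Number);
}

console.log(extractNum('This is a10 string with3.14decimals6 and numbers.'));
// [ 10, 3.14, 6 ]
```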




The extractWords function takes a string and an array of words and returns an array of matched words in the string.

var words = ['this', 'some', 'words'];
parsa.extractWords('thisadkfdlfkdisdsstringdfjdkwithdkfdfkldsomefdfdfkdflkwordsjfgjkfg', words)
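A minimal stand-in for this one (parsa's real matching may be more sophisticated) is a simple substring filter:

```javascript
// Hypothetical stand-in for parsa.extractWords
function extractWords(str, words) {
    return words.filter(function (word) {
        return str.indexOf(word) !== -1;
    });
}

var words = ['this', 'some', 'words'];
console.log(extractWords('thisadkfdlfkdisdsstringdfjdkwithdkfdfkldsomefdfdfkdflkwordsjfgjkfg', words));
// [ 'this', 'some', 'words' ]
```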




The securePassword function takes a password string and returns a boolean whether it's a secure password.


Password requirements are set to standard defaults:

  • at least 8 characters
  • must contain at least 1 uppercase letter, 1 lowercase letter, and 1 number
  • Can contain special characters
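As an illustration of the defaults listed above (not parsa's actual code), a single regex with lookaheads covers the length, case and number rules while leaving special characters optional:

```javascript
// Hypothetical stand-in for parsa.securePassword matching the listed defaults
function securePassword(password) {
    // at least 8 chars with 1 uppercase, 1 lowercase and 1 number
    return /^(?=.*[a-z])(?=.*[A-Z])(?=.*\d).{8,}$/.test(password);
}

console.log(securePassword('Passw0rd'));  // true
console.log(securePassword('password'));  // false - no uppercase or number
```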



Handlebars/Moustache vs Pug/Jade

There are a few template engines you can choose from, but Handlebars (or Mustache) and Pug (formerly Jade) are easily the most popular.

The obvious difference between the two is the syntax. Handlebars offers a familiar HTML syntax that people know and like, and uses the double mustache {{}} for data templating and variable insertion. Pug on the other hand uses a totally different syntax which vaguely reflects a normal HTML file layout but has no closing tags and uses indentation.
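To make the syntax difference concrete, here's the same fragment sketched in both engines (the variable names are made up):

```handlebars
<ul class="posts">
  {{#each posts}}
    <li><a href="{{url}}">{{title}}</a></li>
  {{/each}}
</ul>
```

And the equivalent in Pug, where nesting is expressed purely by indentation:

```pug
ul.posts
  each post in posts
    li
      a(href=post.url)= post.title
```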

After looking past the obvious differences, which led me to initially use Handlebars purely on syntax familiarity, they offer big differences under the hood.

Which one is best?

Pug has a lot of power baked into the engine. You can do practically everything you want to do without having to get plugins or write helpers. Pug has a huge range of helpers which once you get the hang of the syntax, are intuitive and easy to use. Due to the template files not containing closing tags, the files are smaller, easier to read and maintain. As indentation is key to pug you find yourself really concentrating on it and the file layout and format is easy to read because of it.

I encourage you to check them both out. Don't be afraid of the different syntax of pug because once you get the hang of it, you will love the power.

Adding related posts to your Ghost blog theme

Adding related posts to your Ghost blog theme can be really handy to show content that readers may be interested in.

The following code is a simple Bootstrap markup example on adding related posts. This example uses the Ghost get command to retrieve the last 3 (limit="3") posts which do not contain the ID of the current post. You will notice the code below exists within the {{post}} tag of the post.hbs file as we are using the {{id}} of the current post as a filter. If you want to filter based on another value, you can easily add this code after the closing {{/post}} tag.

File: post.hbs

<div class="row">
    <div class="col-xs-12 col-md-8 col-md-offset-2">
        <h4 class="text-center">You may also like...</h4>
        {{#get "posts" limit="3" filter="id:-{{id}}"}}
            {{#foreach posts}}
                <div class="col-xs-12 col-md-4">
                    <a href="{{url}}">{{title}}</a>
                </div>
            {{/foreach}}
        {{/get}}
    </div>
</div>

If you have any feedback, please let me know in the comments.

Using the code/markdown backtick in iPhone/iPad (iOS) device

When writing articles with code or syntax highlighting you may want to use the backtick (eg: `).

It's not very obvious how to do this on an iPhone/iPad (iOS) device. Here is how:

  1. Press the 123 button on your iOS keyboard
  2. Hold down the apostrophe/forward tick (eg: ')
  3. Slide your finger over to the far left to select the backtick

See here:

backtick screenshot

If you are doing this a lot, you may want to create a keyboard shortcut.

To do this:

  1. Select Settings
  2. Then General
  3. Then Keyboard
  4. Select Text Replacement
  5. Tap the + in the top right of screen
  6. Type a backtick (see above) in the Phrase section
  7. Type ;; in the Shortcut - or anything else which you will remember.

Then you can access a backtick quickly by typing ;; on your iOS keyboard!


args - Making CLI command line tools easier to create

If you have attempted to make a CLI in Node.js you would know there are a few packages available at your disposal. You would also know that some seem to do too much and are complicated to get running. args makes building CLI apps a breeze.


npm install args --save

Getting started

Creating a CLI is as easy as:

#!/usr/bin/env node

import args from 'args'

  .option('port', 'The port on which the app will be running', 3000)
  .option('reload', 'Enable/disable livereloading')
  .command('serve', 'Serve your static site', ['s'])

const flags = args.parse(process.argv)

This will allow for 3 args: port, reload and serve. It will also automatically generate help output (below) and suggest possible commands/args if the user makes a typo.

  Usage: haha [options] [command]

  Commands:

    serve, s       Serve your static site
    help           Display help

  Options:

    -v, --version  Output the version number
    -r, --reload   Enable/disable livereloading
    -h, --help     Output usage information
    -p, --port     The port on which the app will be running

See more:

GithubDocs - A Nodejs SPA built using Local Markdown or docs in your Github repo

GithubDocs builds a Single Page Application (SPA) using Markdown docs from your Github repo or a local directory!

Simply create some Markdown documents/files in your Github repo (or locally), setup the GithubDocs config file with your Github repo details and it's done. Your docs are indexed and displayed in a great app with full text search and beautiful responsive design.

See a demo here:


Install it here:

Stop / remove all Docker containers

Sometimes a failed push using Dokku, or some issue with Docker leaving unwanted containers behind, can cause your server some grief.

To stop and remove all containers simply:

# docker stop $(docker ps -a -q)
# docker rm $(docker ps -a -q)

Introducing miniPaaS - A mini push-to-deploy PaaS platform

miniPaaS is a tiny push-to-deploy PaaS platform allowing you to simply push code to your server quickly and easily. Check it out on NPM.

With miniPaaS you simply run the following commands to push code to your server:

  1. git add .
  2. git commit -m "my commit message"
  3. minipaas deploy

Your code will then be packaged up, pushed to your server, and your application restarted.

miniPaaS uses pm2 on the server to manage the process and auto restarting on errors.

Installing miniPaaS is easy:

Local machine:

npm install minipaas -g

Remote server

You can install these individually by using the following commands (skip anything which is already installed):

Ubuntu/Debian:
  • Install Nodejs: curl -sL | sudo -E bash - && apt-get install nodejs
  • Install Unzip: apt-get install unzip
  • Install PM2: npm install pm2 -g

CentOS/RHEL:
  • Install Nodejs: curl --silent --location | bash - && yum -y install nodejs
  • Install Unzip: yum install unzip
  • Install PM2: npm install pm2 -g

miniPaaS is currently in BETA. Any feedback or bug reports you can provide to make it better would be greatly appreciated -

Dokku - Could not read from remote repository on digital ocean

Firing up a DigitalOcean droplet with one-click Dokku should be easy, right? Well, if you get this error it's due to your SSH keys not being added correctly, either when setting up the droplet or when you added them yourself.

You simply need to run:

cat ~/.ssh/ | ssh root@droplet_ip_address "sudo sshcommand acl-add dokku laptop"

Adding new lines to your IFTTT recipes

Adding breaks or new lines to your IFTTT recipes can be a difficult task. Facebook seems to be particularly picky with its new line characters, where standard new lines like \r, \r\n and <br> are ignored.

The solution:


No worries, glad I could help!

Building a reliable and scalable Node.JS SaaS application

The SaaS app we built is an FAQ/Knowledge base and support ticketing platform called ezyFAQ -

Having built many Node.js projects, this was our first venture into building a scalable SaaS app. After our initial investigation, we couldn't find much advice on where to start and things to look out for. We found vague articles on various projects built many years ago, but nothing using modern tech, in particular Node.js.

We are intending this article to be helpful to anyone wanting to build a SaaS app using Node.Js.


Up until this project we built our apps on self managed Digital Ocean VMs. This was fine in isolation, but we found it difficult to find any information on scaling and load balancing to grow with the app's user base.

We decided to go with a dedicated Node.Js host (Heroku) which used the Dyno type approach. This seemed like the best approach to easily scale as the customer base and load grows.


Generally our database of choice to pair with Node.JS is MongoDB. After trying and considering various other databases, we decided to stick with MongoDB.

Hosting MongoDB yourself is easy enough, but we wanted something more reliable with load balancing/redundancy, scaling and backups. There are various options from MongoDB Atlas, mLab etc. After some consideration, we went with mLab for ease of use, scalability and best price.

Application structure

This is where we spent most of our time trying to figure out the best approach. There are two parts to the app: the front and backend. The frontend is the part of the app which would see all the public traffic. Each customer of our app would have their own FAQ with a subdomain (and optional custom domain) which would see significant traffic. The backend is the management side for our customers where they would manage settings, content, style and more. The backend would receive minimal traffic in comparison to the public facing frontend and so would have much less of a need to scale.

Instead of creating one big app we decided to split them out and run on separate Heroku plans. This way we can scale the frontend easily whilst leaving the backend as is. It also means we can easily do maintenance, add features etc without affecting the public facing side of the app.


We learnt a lot. First of all, we learnt that making a standalone app into a SaaS is not as easy as it sounds. There are many different aspects which need to be considered and worked through. We found that scalability and being flexible were the keys to our success, and this is where we spent most of our time. We also found that doing and managing everything yourself is not always the best thing. Leave the server and DB hosting to a dedicated company to manage for you. As a startup, you can't possibly be professional and perfect at everything. You can always bring services back in house as you grow and your available skill set grows too.

We would love to hear feedback from others who have faced similar hurdles getting their SaaS app off the ground and how they dealt with them.

ezyFAQ: An easy-to-use yet beautiful and powerful FAQ/Knowledge base

ezyFAQ is a very powerful yet affordable solution to setup a FAQ/knowledge base without all the complexities (and cost) of Zendesk and other solutions. Studies have shown that most customers would much prefer to quickly find the solution themselves rather than wait on an email response or make a phone call.

ezyFAQ allows for customising your FAQ/knowledge base with branding, CSS and HTML. ezyFAQ also allows you to bring your own domain for a seamless integration with your existing website - e.g: The live search, analytics, responsive design (also beautiful on Tablets and Phones), pre-built themes and templates allow you to customise a little or a lot!

ezyFAQ runs its own FAQ using the ezyFAQ platform which you can view here:

More information can be found at

Writing your first Node.js module

This isn't meant to be an exhaustive guide on how to write an NPM module. This guide is meant to be a simple working example where you can see a basic working module and easily adapt this to create your own module.

You can see the basic structure is really easy to understand. We are exposing the multiply() function as a public function by returning in the module.exports. The other function aptly named nonPublic() is called by the multiply() function but cannot be called publicly. More on this below.

You can see our multiply() function takes two values, multiplies them and returns a label from our nonPublic() function, followed by our multiplied value. Easy!

File: multiply.js

// require any modules

module.exports = {
    multiply: function (val1, val2, callback) {
        var returnedValue = val1 * val2;
        callback(null, nonPublic() + returnedValue);
    }
};

function nonPublic() {
    return 'Result: ';
}

File: test.js

Using our new module locally for testing is easy:

var mod = require('./multiply');

try {
    mod.nonPublic(); // not exported - this throws
} catch (err) {
    console.log(err.message);
}

mod.multiply(5, 10, function(err, result){
    console.log(result); // Result: 50
});

The first line requires our local module. Note: the ./ value for modules located in the same directory.

After we have required it we can go ahead and use it. First we call the nonPublic() function to show it doesn't work publicly (this outputs an error), then call the multiply() function.

We pass in 5 and 10 to be multiplied together and we write the result to the console.

To run our test.js script we simply run the following in our console and observe the output:

node test.js
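If you later want to publish the module, a minimal package.json pointing main at the module file would look something like this (the name here is a placeholder):

```json
{
  "name": "my-multiply-module",
  "version": "1.0.0",
  "main": "multiply.js"
}
```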


This is a really basic module which outlines the basic steps to get started on writing your first NPM module.

One of my slightly (hardly) more advanced modules (it has options etc), metaget, can be found here as further reading:

Ensure Express App has started before running Mocha/Supertest tests

Seems simple enough, but when running tests I ran into a problem where Mocha/Supertest was not waiting for my Express app to fully start before running the tests.

The relatively easy way to overcome this is to use an event emitter in your Express app and wait for that to complete before starting your tests. This doesn't appear to be documented anywhere obvious.

You will need to setup the event emitter in your Express app which is the final step before assuming the app has started and is ready. In my case, I had made the DB connection etc and now the call to app.listen was my final event.

Here is an example:

app.listen(app_port, app_host, function () {
    console.log('App has started');
    app.emit('appStarted');
});

The specific line is:

app.emit('appStarted');

This creates an event which we can wait on called appStarted (this can be changed to whatever you want).

Next we need to wait for this event in our Mocha/Supertest tests (test.js).

First we will require our Express app. Note: app is my main Express file, some people use server.js and this value would then become require('../server'):

app = require('../app');

We then need to create a Supertest agent using our Express instance:

var request = require("supertest");
var agent = request.agent(app);

Then we wait for our Express event using before():

before(function (done) {
    app.on("appStarted", function(){
        done();
    });
});
Then we can kick off all our tests. A full test example:

var request = require("supertest");
var assert = require('chai').assert;

app = require('../app');
var agent = request.agent(app);

before(function (done) {
    app.on("appStarted", function(){
        done();
    });
});

describe("Add config", function(){
    it("Add a new connection", function(done){'/config/add') // example route - use one from your app
            .expect("Config successfully added", done);
    });
});

markdownTables - Convert your HTML tables into Markdown syntax online

markdownTables is an online tool which enables you to paste in your HTML table code and convert it to Markdown table syntax.


authorStats : Get your NPM package download statistics in an easy to read command line table

authorStats fetches your daily/weekly/monthly download stats for all your authored NPM packages and outputs a nice table right in your command line.


It's best to install the package globally:

npm install author-stats -g


authorStats <npm username>

Where <npm username> is the username on the NPM website. My profile is: and username is mrvautin.

A nice command line table with the daily, weekly and monthly download numbers of all your packages will be output to your terminal.

Note: If you have a lot of packages you will need to be patient while authorStats fetches the data.


ghostStrap - A minimalist and responsive Bootstrap theme for the Ghost blogging platform

Upon setting up my Ghost blog, I wanted a theme which was compatible with Bootstrap, as I'm familiar with the layout and it's rock solid in terms of responsive design. I was surprised to find that the Bootstrap themes were either really old and out of date or way over the top and not a good starting point to add my touches.

This pushed me to design ghostStrap which can easily be used as a starting point for anyone wanting to create a theme using the Bootstrap standard.


Some commands may need sudo

  1. From the root of your Ghost install: cd content/themes/
  2. git clone
  3. Restart Ghost
  4. Visit the admin panel: http://localhost:2368/ghost
  5. Select General
  6. Select ghostStrap from the Theme dropdown
  7. Have fun

Please leave a comment if you use the theme or have any feedback.



Single post


Mobile layout

Mobile layout menu

Metaget - Nodejs module to fetch remote Meta Tags (including Open Graph) from URL

A Node.js module to fetch HTML meta tags (including Open Graph) from a remote URL


npm install metaget --save


var metaget = require("metaget");
metaget.fetch('', function (err, meta_response) {
    console.log(meta_response);
});

Response will be a Javascript Object containing all the meta tags from the URL. All tags are output in the example above. Some tags with illegal characters can be accessed by:

console.log(meta_response['og:title']);

It's possible to set any HTTP headers in the request. This can be done by specifying them as options in the call. If no options are provided the only default header is a User-Agent of "request".

This is how you would specify a "User-Agent" of a Google Bot:

var metaget = require("metaget");
metaget.fetch('', {headers: {"User-Agent": "Googlebot"}}, function (err, meta_response) {
    console.log(meta_response);
});


  1. Fork it!
  2. Create your feature branch: git checkout -b my-new-feature
  3. Commit your changes: git commit -am 'Add some feature'
  4. Push to the branch: git push origin my-new-feature
  5. Submit a pull request :D


adminMongo is a web based user interface (GUI) to handle all your MongoDB connection/database needs. adminMongo is fully responsive and should work on a range of devices.

adminMongo connection information (including username/password) is stored unencrypted in a config file. It is not recommended to run this application on a production or public facing server without proper security considerations.


  1. Clone Repository: git clone && cd adminMongo
  2. Install dependencies: npm install
  3. Start application: npm start
  4. Visit in your browser

Demo (read only)

A read only demo can be seen here


  • Manage from a connection level for easy access to multiple databases
  • Create/Delete databases
  • Create/Delete/Edit collection
  • Create/Delete/Edit documents
  • Create/Delete indexes
  • Query documents
  • Collection statistics
  • Export collections in JSON format


  • Documents need to have an "_id" value which is a string, integer, or MongoDB ObjectId. Documents using composite ID indexing are currently not supported.


adminMongo will listen on host: localhost and port: 1234 by default.
This can be overridden by adding a config file at /config/app.json. The config file can also override the default of 5 docs per page.
The config file options are:

{
    "app": {
        "host": "",
        "port": 4321,
        "docs_per_page": 15
    }
}

Note: Any changes to the config file require a restart of the application


Create a connection

After visiting you will be presented with a connection screen. You need to give your connection a unique name as a reference when using adminMongo, and a MongoDB formatted connection string. A MongoDB connection string takes the form: mongodb://<user>:<password>@<host>:<port>/<db>, where specifying down to the <db> level is optional. For more information on MongoDB connection strings, see the official MongoDB documentation.

Note: The connection can be either local or remote, hosted on a VPS or a MongoDB service such as MongoLab.

adminMongo connections screen
The Connection setup screen

Connection/Database admin

After opening your newly created connection, you are able to see all database objects associated with your connection. Here you can create/delete collections, create/delete users and see various stats for your database.

adminMongo database screen
The connections/database screen


After selecting your collection from the "Database Objects" menu, you will be presented with the collections screen. Here you can see documents in pagination form, create new documents, search documents, delete, edit documents and view/add indexes to your collection.

adminMongo collections screen
The collections screen

Searching documents

You can search documents using the Search documents button on the collections screen. You will need to enter the key (field name) and value. Eg: key = "_id" and value = "569ff81e0077663d78a114ce".

You can clear your search by clicking the Reset button on the collections screen.

adminMongo search documents
The collections screen


Adding and editing documents is done using a JSON syntax highlighting control.

adminMongo documents
Editing a document


Indexes can be added from the collection screen. Please see the official MongoDB documentation on adding indexes.

adminMongo documents
Viewing/Adding indexes


  1. Fork it!
  2. Create your feature branch: git checkout -b my-new-feature
  3. Commit your changes: git commit -am 'Add some feature'
  4. Push to the branch: git push origin my-new-feature
  5. Submit a pull request :D

Easy way of moving notes from iPhone to iCloud

You may experience an issue where older notes are considered "ON MY IPHONE" and not backed up to iCloud. You have two options: 1. manually copy all notes into new ones, which by default sit in iCloud; or 2. follow the steps below.

  1. Select "Notes" section below "ON MY IPHONE".
    notes on my iphone
  2. Select "edit" (top right), select notes manually or select "Move All" (bottom left)
    select notes
  3. Select "Notes" under "ICLOUD" section to copy notes.
    select iCloud

NOTE: The app may not update the number of notes until it's closed and reopened.

openKB - Open Source Nodejs Markdown based knowledge base (FAQ) app

openKB is an open source Markdown based Knowledge base application (FAQ) built with Nodejs and ExpressJS. The application uses an embedded database (nedb) for easy installation without a full Database server.
The application is designed to be easy to use and install and based around search rather than nested categories. Simply search for what you want and select from the results.



  1. Clone Repository: git clone && cd openKB
  2. Install dependencies: npm install
  3. Start application: npm start
  4. Go to in your browser


  • Search: openKB is a search based Knowledgebase (FAQ) backed by Lunr.js indexing to create the best possible results on searches.
  • Backend: openKB uses the pure javascript nedb embedded database. This means no external databases need to be setup.
  • Design: openKB is meant to be a simple flat design. With that said, openKB is very customisable: by adding your CSS file to /public/stylesheets/ and adding a link in /views/layouts/layout.hbs you can add your own styling and graphics.
  • Responsive: openKB is built using Bootstrap allowing it to be responsive and work on all devices. The admin can be a little difficult editing Markdown on smaller screens.
  • Mermaid: openKB allows for Mermaid charts in articles.
  • Editor: openKB uses Markdown-it which is based off the CommonMark spec. This allows for the very best Markdown experience.
  • Image management: openKB allows for drag and drop of images into articles. The image is automatically uploaded to the server in the background. Google Chrome users can also paste images directly from the clipboard.


Article view
Article filtering



A new user form will be shown where a user can be created.


There are a few configurations that can be made, which are held in /routes/config.js. If any values are changed the app will need to be restarted.

Running in production

Using PM2 seems to be the easiest and best option for running production websites.
See the PM2 documentation for more information, or a short guide here: