
Node Js delivering files to Client


Problem

When you send an HTML file to the client, it gets parsed and the browser makes GET calls for anything else it references, like <script> or <link> tags (is <img> included in that? I think it is).

It is very expensive to do this process:

  • Ask for file - CLIENT
  • Read file from disk - NODE
  • Publish contents to Response Stream - NODE
  • Read response stream - CLIENT

Is there any way to cache the file contents in Node when the server starts up, in order to simply do:

  • Ask for file - CLIENT
  • Publish contents to Response Stream - NODE
  • Read Response Stream - CLIENT
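
For illustration, here is a minimal sketch of what serving from an in-memory cache could look like (my own sketch, not from the original question; the cache object and port are assumptions):

var http = require("http");

// Assumed to be filled at startup, e.g. { "/Scripts/app.js": <Buffer ...> }
var cache = {};

http.createServer(function (req, res) {
    var cached = cache[req.url];
    if (cached) {
        // No disk read: publish the cached contents straight to the response stream
        // (Content-Type handling omitted for brevity).
        res.writeHead(200);
        res.end(cached);
    } else {
        res.writeHead(404);
        res.end();
    }
}).listen(8888);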

I've got this function to go through all the scripts in the Scripts/ folder and read the files:

var fs = require("fs"),
    path = require("path");

var gatherFiles = function () {
    fs.readdir('/Scripts', function (err, files) {
        files.filter(function (file) {
            return path.extname(file) == '.js';
        }).forEach(function (file) {
            // read files here
        });
    });
};

Bear in mind these points:

  • It's on a Raspberry Pi; it DOESN'T need to be scalable
  • I don't mind if it is expensive at the start

I assume that, because the server should only be started once, if all the files were cached in Node at the beginning, then whenever they're needed they can be served straight from memory. The files are script files, so they aren't massive, 100 KB max really (probably much less after minification).

I ask this because, when I've used the Chrome and Firefox network monitors to see how long it takes to serve out the files, it takes a very long time, over 100 ms.

I'm guessing there is latency across many parts of the server: ethernet, SD card, code.

So I think that, by pre-caching this way, I should be able to minimise that 100 ms lag.

My Solution

var fs = require("fs"),
    path = require("path"),
    async = require("async"),
    scriptFiles = {};

function gatherFiles(callback, folder) {
    fs.readdir(folder, function (err, files) {
        var filteredFiles = files.filter(function (file) {
            var ext = path.extname(file);
            return ext == '.js' || ext == '.html';
        });
        async.each(filteredFiles, function (file, asyncCallback) {
            // Already cached: tell async.each this item is done.
            if (scriptFiles[file]) return asyncCallback();
            fs.readFile(folder + '/' + file, function (err, data) {
                scriptFiles[file] = data;
                asyncCallback();
            });
        }, function (err) {
            callback(scriptFiles);
        });
    });
}

exports.gatherFiles = gatherFiles;

The key points to note are:

  • async.each requires you to fire a callback to let it know that each item has completed; once it has detected that all the callbacks have fired, it calls the final callback. This also allows errors to permeate through.
  • In the final callback it invokes the main function's callback, passing through the file dictionary with the data; the anonymous function you supply should then start the HTTP server.
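
As a standalone illustration of the first point (my own sketch, not part of the original solution), async.each calls the final callback once every item has reported back, or immediately with the error if any item fails:

var async = require("async");

async.each([1, 2, 3], function (item, asyncCallback) {
    // Simulate an async task; report an error for one item to see it propagate.
    setTimeout(function () {
        asyncCallback(item === 2 ? new Error("failed on item " + item) : null);
    }, 10);
}, function (err) {
    // Fired when all items have called back, or as soon as an error comes through.
    console.log(err ? err.message : "all done");
});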

Usage of the cacher module:

var cacher = require("./cacher");
cacher.gatherFiles(function (fileDictionary) {
    http.createServer(onRequest).listen(8888);
}, './Scripts');

Problem courtesy of: Callum Linington

Solution

I think something like this would do what you're looking for, which is to eliminate the disk read.

var fs = require("fs");

var Files = (function () {
    var files = {};

    return function (filepath, callback) {
        // If a cached copy is available, return that
        if (files[filepath]) return callback(null, files[filepath]);
        // Otherwise, read it from disk, then store it
        fs.readFile(filepath, function (err, data) {
            if (err) return callback(err);
            files[filepath] = data;
            callback(null, data);
        });
    };
})();

Then call it like so:

Files('/scripts/whatever.js', function (err, data) {
   // Do something
});

If you want to, you can add other features such as file watching or cache expiration.
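
For instance, one way file watching could be bolted on (my assumption of one approach, not from the answer) is to register an fs.watch listener when a file is first cached and drop the entry when it changes, so the next call re-reads it from disk:

var fs = require("fs");

var Files = (function () {
    var files = {};

    return function (filepath, callback) {
        if (files[filepath]) return callback(null, files[filepath]);
        fs.readFile(filepath, function (err, data) {
            if (err) return callback(err);
            files[filepath] = data;
            // Hypothetical addition: invalidate the cached copy when the file changes,
            // and close the watcher so we don't accumulate listeners.
            var watcher = fs.watch(filepath, function () {
                delete files[filepath];
                watcher.close();
            });
            callback(null, data);
        });
    };
})();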

If you want to cache it all at the beginning, you can just iterate through whatever directory you like and not bother passing a callback.

For that I'd modify it as such:

//...
if (files[filepath]) return callback && callback(null, files[filepath]); 
//...
callback && callback(null, data);
//...

The callback && callback(...) works the same way as if (false && alert("Nothing")): the alert is never reached, because the falsy left-hand side short-circuits the rest of the expression. So if you don't pass a callback, it never tries to execute one. Sort of a fail-safe.
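
Putting that together, warming the cache at startup could look something like this (a sketch that assumes the modified Files function above and a ./Scripts folder; the folder name is illustrative):

var fs = require("fs"),
    path = require("path");

// Read every .js file once with no callback; the contents end up in the cache.
// (The error branch inside Files would need the same callback && guard.)
fs.readdir('./Scripts', function (err, entries) {
    if (err) return console.error(err);
    entries.filter(function (file) {
        return path.extname(file) === '.js';
    }).forEach(function (file) {
        Files(path.join('./Scripts', file));
    });
});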

Solution courtesy of: Robert
