How is request processing with rails, redis, and node.js asynchronous?

Problem

For web development I'd like to mix Rails and Node.js, since I want to get the best of both worlds (Rails for fast web development and Node for concurrency). I know that some people choose a full Ruby stack with EventMachine integrated into the Rails controllers, so that every request can be non-blocking using fibers in an event-loop model. I understand the big picture of how that works.

At this moment, however, I want to try non-blocking request processing with Rails and Node.js using a message-queue concept. I heard this can be achieved by using Redis as an intermediary, but I'm still having trouble figuring out how it works. From what I understand: we have two apps, A (Rails) and B (Node.js), plus Redis. The Rails app handles requests from users through controllers in a REST manner, then passes them to Redis; Redis forms queues, and the Node.js app picks jobs off the queue and does whatever is necessary afterwards (writing to or reading from the backend DB).
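The flow described here is a classic producer/consumer pair. The sketch below illustrates it in Ruby; note that the Redis list is stood in by an in-memory `Queue` so the example is self-contained. In a real setup the Rails side would use the `redis` gem (`LPUSH` onto a list) and the Node.js side would `BRPOP` the same list with its own Redis client.

```ruby
require "json"

# Stand-in for a Redis list. In production this would be
# redis.lpush("jobs", payload) on the Rails side and a blocking
# BRPOP "jobs" on the Node.js side.
JOB_QUEUE = Queue.new

# Rails side (producer): the controller returns immediately after
# enqueueing, instead of doing the slow work inline.
def enqueue_job(user_id, action)
  payload = JSON.generate(user_id: user_id, action: action)
  JOB_QUEUE.push(payload)
  :accepted # HTTP 202-style response; the work happens later
end

# Node.js side (consumer): picks jobs off the queue asynchronously.
def process_next_job
  job = JSON.parse(JOB_QUEUE.pop)
  "processed #{job["action"]} for user #{job["user_id"]}"
end

enqueue_job(42, "import_contacts")
result = process_next_job
```

The point of the pattern is visible in `enqueue_job`: the request handler's only blocking work is a fast queue push, and the expensive part runs later on the consumer side.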

My questions:

  1. How would that improve concurrency and scalability? From what I know, since Rails handles requests through controllers synchronously and then writes to Redis, the requests will still be blocking, even though the Node.js end can pick up the queue asynchronously. (I have a feeling it isn't really asynchronous unless it's non-blocking end to end.)

  2. Would node.js be considered a proxy or an application here if redis is the intermediary?

  3. I'm new to Redis and still learning it. If I'm using a 100% NoSQL solution for my backend database, such as MongoDB or CouchDB, can Redis replace them entirely, or is Redis better seen as a message-queue tool like RabbitMQ?

  4. Is a message queue a different concurrency concept from threading or the event-loop model, or is it meant to supplement them?

Those are all my questions. I'm new to the message-queue concept and will appreciate any help, pointers in the right direction, and articles that help me learn more. Thanks.

Problem courtesy of: Benny Tjia

Solution

You are mixing some things here that don't go together.

Let's first make sure we are on the same page regarding the strengths and weaknesses of the technologies involved:

Rails: Used for its web-development simplicity; perfect for serving database-backed web applications. Not very performant when serving a large number of long-running requests, as you'd run out of threads on your Ruby workers, but well suited for anything that can scale horizontally with more web nodes (multiple web servers, one DB).

Node.js: Great for high-concurrency scenarios. Not as easy as Rails for writing a regular web application, but able to handle an almost insane number of long-running, low-CPU tasks efficiently.

Redis: A key-value store that supports atomic operations on its data structures (increment/decrement values, append/prepend and push/pop on lists), operations that keep the data consistent even with multiple clients writing at once.
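To make those semantics concrete, here is a minimal in-memory mimic of the commands meant above; `MiniRedis` is an illustrative stand-in, not the real client. With the `redis` gem against a running server, these would be `redis.incr("counter")`, `redis.rpush("list", x)`, and `redis.lpop("list")`.

```ruby
# Minimal mimic of the Redis operations described above (INCR,
# RPUSH/LPOP). The Mutex plays the role of Redis's single-threaded
# command execution: each operation is atomic with respect to
# concurrent callers.
class MiniRedis
  def initialize
    @data = {}
    @lock = Mutex.new
  end

  # INCR: atomic increment, safe with many clients writing at once
  def incr(key)
    @lock.synchronize { @data[key] = @data.fetch(key, 0) + 1 }
  end

  # RPUSH: append to a list (this is what makes Redis usable as a queue)
  def rpush(key, value)
    @lock.synchronize { (@data[key] ||= []) << value }
  end

  # LPOP: take the oldest element off the front of the list
  def lpop(key)
    @lock.synchronize { (@data[key] || []).shift }
  end
end

r = MiniRedis.new
3.times { r.incr("pageviews") }
r.rpush("jobs", "a")
r.rpush("jobs", "b")
```

The list commands paired as `RPUSH` on one side and `LPOP` (or blocking `BLPOP`) on the other are exactly the queue mechanism the question asks about.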

Now, as you can see, there is no benefit in having Rails AND Node serve the same request, communicating through Redis. Going through the Rails stack provides no benefit if the request ends up being handled by the Node server. And even if you only offload some processing to the Node server, it's still the Rails web server that handles the request and has to wait for a response from Node, killing the desired scalability. It simply makes no sense.

Where a setup with Node and Rails together does make sense is when certain areas of your app have drastically different scaling requirements.

If, for example, you are writing a website that displays live stats for football games, you can easily see two different concerns in your app: the "normal" site with signup, billing, and profile pages, which screams for a quick implementation in Rails; and the "live" portion of the site, where users see live results and you expect to handle a lot of clients at once, all waiting for something to happen (low CPU, high concurrency).

In such a case it may be beneficial to actually separate the two parts of the site into a Ruby app and a Node app, sharing data about the user through a store like Redis (really, you just need some shared state that both can read and write for synchronization purposes).

So you would use Rails for the signup/login portions; once a user has signed up, write the session cookie into Redis alongside the user's permissions (which games they are allowed to follow) and hand the user off to the Node.js app. There the Node app can read the session information from Redis and serve the user.
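That handoff can be sketched as follows. This is a hedged illustration: the `SESSIONS` hash stands in for Redis, and the key scheme (`session:<id>`) and field names are assumptions for the example. In reality Rails would write via the `redis` gem and the Node.js app would read the same key with its own Redis client.

```ruby
require "json"
require "securerandom"

# Stand-in for Redis; a real setup would use
# redis.set("session:#{sid}", payload) from Rails.
SESSIONS = {}

# Rails side: after login, persist the session and the user's
# permissions where the Node.js app can see them.
def create_session(user_id, allowed_games)
  sid = SecureRandom.hex(16)
  SESSIONS["session:#{sid}"] = JSON.generate(
    user_id: user_id,
    allowed_games: allowed_games
  )
  sid # set as the session cookie, then hand off to the Node app
end

# Node.js side: authorize a live-stream request using only the
# shared state in Redis - no call back into Rails is needed.
def authorized?(sid, game_id)
  raw = SESSIONS["session:#{sid}"]
  return false unless raw
  JSON.parse(raw)["allowed_games"].include?(game_id)
end

sid = create_session(7, ["match-123"])
```

The design point is that the two apps never talk to each other directly; the shared store is the only coupling, so each side can scale on its own terms.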

A word of advice: you don't get scalability by simply throwing Node.js into your toolbox. You really have to find out what Node.js is good at (low-CPU, high-IO concurrent operations) and how you can leverage that to remedy some of the problems of your currently chosen technology.

Solution courtesy of: Tigraine




This post first appeared on Node.js Recipes.
