If your application needs to be prepared not just for Big Data but for fast Big Data, you need to rethink your application infrastructure from the ground up. A traditional program will eventually run into performance problems, because its algorithms execute a fixed sequence of steps, time and time again. There is no flexibility in such program execution; to get more performance, additional vertical or horizontal scaling of resources is needed. Programming frameworks also add considerable overhead to even basic operations (such as CRUD operations), and under high load this overhead adds up.
A procedural program processing Big Data will eventually grind to a halt. Big Data specialist Mike Barlow makes a similar point in his book The Culture of Big Data (page 12).
A reactive application architecture addresses these problems because it is radically different from traditional procedural or even object-oriented programming. In reactive programming, the service functions themselves can be decomposed into a loosely coupled, event-driven architecture. Services, repositories, views, and controllers are glued together through event-driven programming, where messages are mediated between producers and consumers by a messaging infrastructure. A reactive application makes the real user experience more responsive and application-like.
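To make the event-driven glue concrete, here is a minimal sketch of a message mediator, assuming an in-process event bus with hypothetical topic names; a production system would use a real messaging infrastructure rather than this illustration:

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process message mediator (illustrative sketch only)."""

    def __init__(self):
        # Maps a topic name to the handlers subscribed to it.
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a consumer callback for a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        """Deliver a message from a producer to every subscribed consumer."""
        for handler in self._subscribers[topic]:
            handler(message)

# Hypothetical wiring: a repository and a view both react to events
# emitted by a service, without either knowing about the other.
bus = EventBus()
audit_log = []
bus.subscribe("user.created", lambda msg: audit_log.append(msg))
bus.publish("user.created", {"id": 1, "name": "Ada"})
```

Because producer and consumer only share a topic name, either side can be replaced or scaled out independently, which is the loose coupling described above.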
Many companies want access to Big Data, but querying such data is difficult because it arrives in various formats. Moreover, data analysis is nowadays often performed to explore sets of unknown data. Gluing together data sources gives access to lots of data, but the data also needs to be searchable to make sense of it. Using schema-less databases is one way to solve this problem of aggregating and analyzing heterogeneous data.
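As a rough sketch of the schema-less idea, the snippet below aggregates documents from hypothetical sources with different fields and searches across them without a fixed schema; the source names and fields are invented for illustration:

```python
# Documents from different hypothetical sources; each has its own shape,
# but they share a "customer" key that makes cross-source search possible.
documents = [
    {"source": "crm", "customer": "Acme", "email": "ops@acme.example"},
    {"source": "logs", "customer": "Acme", "latency_ms": 42},
    {"source": "billing", "customer": "Beta Ltd", "amount": 99.0},
]

def search(docs, **criteria):
    """Return every document whose fields match all given criteria."""
    return [d for d in docs
            if all(d.get(key) == value for key, value in criteria.items())]

acme_records = search(documents, customer="Acme")
# Matches the CRM and log documents, despite their different schemas.
```

A document database applies the same principle at scale: records keep their native shape, and queries match on whatever fields are present.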
A loosely coupled reactive architecture will also allow your network to grow as your business needs it, while maintaining performance and a great end user experience.
What is your opinion on building Reactive Big Data applications?