The Map/Reduce model is today’s answer to the big data challenges. While perfectly suitable in most cases, it is not as good when analysis must proceed in the opposite direction.
When an “issue” is identified in the course of analysing a large data set, a question may arise: what changes in the data would make the issue go away? The question is typically a hard one. It leads to a myriad of possibilities, a combinatorial explosion.
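To see why such a backward question explodes, consider a minimal sketch (the function name, data, and threshold are all hypothetical): even for the simplest aggregate “issue”, a brute-force search over which records to drop must examine up to 2^n subsets of the data.

```python
from itertools import combinations

def minimal_fix(records, threshold):
    """Backward reasoning by brute force: find the smallest set of
    records whose removal brings the total back under the threshold.
    Examines up to 2^n subsets -- the combinatorial explosion."""
    total = sum(records)
    for k in range(len(records) + 1):
        for subset in combinations(range(len(records)), k):
            if total - sum(records[i] for i in subset) <= threshold:
                return [records[i] for i in subset]
    return None

# Hypothetical data: the "issue" is that the total exceeds 100.
records = [40, 35, 30, 25, 10]
print(minimal_fix(records, 100))  # → [40]
```

With only a handful of records the answer is instant; with millions, the search space dwarfs any fixed cluster, which is exactly why elastic compute matters here.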
To cope with the combinatorial explosion of backward reasoning, the analytic engine must be able to quickly add computing resources and release them as soon as a satisfactory solution is found. Sounds just like an Erlang on Xen advert.