Massively Parallel Processing in cloud big data
Have you ever wondered how the Pyramid of Giza was built? What it takes to run a chip? What the universe is made of?
Let us throw some light on this. The basic fabric of nature is made up of building blocks that are atomic and available in plenty. The same applies to our computing world. But there is a limit to scaling up a single machine: as it grows, the maintenance overhead grows with it, until the overhead nullifies the very advantage that scaling up was meant to deliver.
Hence the way forward is nature's own path, the theory of 'Scaling Out': when many atomic computing resources are leveraged in parallel, the impact can be massive. And that is what we call MPP, Massively Parallel Processing.
Put more simply, it means splitting a large task into multiple buckets and processing those buckets at the same time, improving both speed and efficiency, the key battlegrounds in this competitive era.
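The bucket-splitting idea above can be sketched in a few lines. This is a minimal illustration, not a cloud MPP engine: it uses Python's multiprocessing module to stand in for a fleet of atomic compute resources, and the function names (`process_bucket`, `scale_out_sum`) are made up for this example.

```python
from multiprocessing import Pool

def process_bucket(bucket):
    # Stand-in for real work: each worker handles one bucket independently.
    return sum(bucket)

def scale_out_sum(data, n_buckets=4):
    # Split the large task into roughly equal buckets...
    buckets = [data[i::n_buckets] for i in range(n_buckets)]
    # ...and process the buckets at the same time, one worker per bucket.
    with Pool(n_buckets) as pool:
        partials = pool.map(process_bucket, buckets)
    # Combine the partial results into the final answer.
    return sum(partials)

if __name__ == "__main__":
    print(scale_out_sum(list(range(1_000_000))))
```

The result is identical to a single sequential sum; what changes is that each bucket can run on a separate core, or, in a real MPP system, on a separate machine.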
With each passing day, internet infrastructure keeps improving in payload capacity and speed, which heightens the need for MPP in the cloud. The near-unlimited availability of atomic computing resources from many cloud vendors is another thumbs up for MPP in the cloud becoming a reality.
To conclude: MPP in the cloud needs the 'scale out' architecture paradigm to be truly effective.
Big Data is a reality - "Think Scale Out"