For Concurrent Data Processing:
The MapReduce model is well suited to any application where data must be processed concurrently: the input is split into independent chunks that are handled in parallel by map tasks, and the mapped output is then aggregated by reduce tasks into the final result.
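A minimal sketch of the map/shuffle/reduce flow described above, using a word-count example. This is a toy in-process simulation (not the Hadoop API); `map_phase`, `reduce_phase`, and the sample chunks are illustrative names chosen here:

```python
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

def map_phase(chunk):
    # Map: emit a (word, 1) pair for every word in this chunk.
    return [(word, 1) for word in chunk.split()]

def reduce_phase(pairs):
    # Reduce: sum the counts emitted for each word.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Each chunk is processed independently, so the map phase parallelizes.
chunks = ["the quick brown fox", "the lazy dog jumps", "over the fox"]
with ThreadPoolExecutor() as pool:
    mapped = list(pool.map(map_phase, chunks))

# "Shuffle": gather all intermediate pairs before the reduce step.
shuffled = [pair for part in mapped for pair in part]
counts = reduce_phase(shuffled)
```

Because each map task touches only its own chunk, adding workers (or, in Hadoop, nodes) scales the map phase without changing the program's logic.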
Protection against Hardware Failure:
Hadoop deals with very large volumes of data, so protecting that data against hardware failure is a genuine concern. Hadoop is fault tolerant by design: data is replicated across nodes, so it survives the failure of individual machines. Likewise, when a node fails, the tasks assigned to it are rescheduled on other healthy nodes, and the computation continues without interruption.
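The rescheduling behavior can be sketched as follows. This is a simplified simulation of the idea, not Hadoop's actual scheduler; the node names and the `schedule_with_retry` helper are made up for illustration:

```python
def run_task(task_id, node):
    # Simulated hardware failure: node "n2" is down.
    if node == "n2":
        raise RuntimeError(f"node {node} failed")
    return f"task-{task_id} done on {node}"

def schedule_with_retry(tasks, nodes):
    # When a node fails, reassign the task to the next healthy node,
    # mirroring (in spirit) how Hadoop reruns tasks from a dead node.
    results = []
    for task in tasks:
        for node in nodes:
            try:
                results.append(run_task(task, node))
                break
            except RuntimeError:
                continue  # this node is down; try the next one
    return results

# "n2" is tried first but has failed, so every task lands elsewhere.
results = schedule_with_retry([0, 1, 2], ["n2", "n1", "n3"])
```

The key point is that a task, not the job, is the unit of failure: losing a node costs only a re-execution of that node's tasks, so the overall computation never fails.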