I see that you are confused about the components of Hadoop. Hadoop is not bound to just one combination such as HDFS + MapReduce or HDFS + Spark.
It has many components, and various combinations of those components are still considered Hadoop. To elaborate:
Hadoop is actually a framework, or software utility, used to process huge collections of data using a cluster computing mechanism. The data to be processed is stored across distributed storage devices, and the storage system (HDFS) is fault tolerant: the data is split into blocks and each block is replicated on multiple nodes, so even if data is lost or a storage unit fails, another node holding a copy of the same block acts as a backup.
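To make the replication idea concrete, here is a sketch of how the block replication factor is typically set in HDFS via `hdfs-site.xml` (the value 3 shown here is the common default; your cluster's actual setting may differ):

```xml
<configuration>
  <!-- Number of copies HDFS keeps of each data block.
       With a value of 3, losing one or even two nodes holding
       a block still leaves a live copy to read from. -->
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>
```

When a node fails, the NameNode notices the under-replicated blocks and schedules new copies on healthy nodes, which is what makes the failure transparent to the application reading the data.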