Decision Trees are versatile and can be used for both regression and classification problems.
Decision Trees are easy to interpret, simple to understand, and easy to debug.
Decision Trees select the most relevant features from the data as they split, so the model is effectively trained on an informative subset of the features, which often yields good accuracy.
A drawback of Decision Trees is that they tend to overfit, learning the training data too well. The tree grows deeper as more features are split on, and as the depth increases the tree increasingly fits noise in the training data.
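The overfitting behaviour described above can be seen by comparing an unrestricted tree against a depth-limited one. This is a minimal sketch assuming scikit-learn is available; the synthetic dataset and the parameter values (`flip_y=0.2` label noise, `max_depth=3`) are illustrative choices, not prescriptions.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary-classification data with 20% noisy (flipped) labels.
X, y = make_classification(n_samples=300, n_features=10, flip_y=0.2,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# An unrestricted tree grows until it memorises the training set.
deep = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# Limiting the depth is a simple form of pruning that curbs overfitting.
shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

print("deep:    train %.2f  test %.2f"
      % (deep.score(X_tr, y_tr), deep.score(X_te, y_te)))
print("shallow: train %.2f  test %.2f"
      % (shallow.score(X_tr, y_tr), shallow.score(X_te, y_te)))
```

The unrestricted tree reaches perfect training accuracy by memorising the noisy labels, while the depth-limited tree deliberately gives up some training accuracy in exchange for a simpler model.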
Naive Bayes does not perform feature selection the way decision trees do; it uses all the features, treating them as conditionally independent given the class.
Naive Bayes classifiers are probabilistic, which makes them well suited to predicting the probabilities of the various classes rather than just a single label.
They are generally used for text classification and multi-class classification, and are also widely employed in robotics and computer vision.
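The two points above can be combined in one toy example: a Naive Bayes text classifier that outputs class probabilities. This is a from-scratch sketch using Laplace smoothing; the "spam"/"ham" training sentences are made up for illustration.

```python
import math
from collections import Counter, defaultdict

# Hypothetical toy corpus: (text, label) pairs.
train = [
    ("free money win prize", "spam"),
    ("win cash now", "spam"),
    ("meeting schedule today", "ham"),
    ("project meeting notes", "ham"),
]

def fit(samples):
    """Estimate log priors and per-class word log-likelihoods (Laplace smoothing)."""
    class_counts = Counter(label for _, label in samples)
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in samples:
        for w in text.split():
            word_counts[label][w] += 1
            vocab.add(w)
    n = len(samples)
    priors = {c: math.log(cnt / n) for c, cnt in class_counts.items()}
    likelihoods = {}
    for c in class_counts:
        total = sum(word_counts[c].values())
        likelihoods[c] = {
            w: math.log((word_counts[c][w] + 1) / (total + len(vocab)))
            for w in vocab
        }
    return priors, likelihoods, vocab

def predict_proba(text, priors, likelihoods, vocab):
    """Return normalised posterior probabilities for each class."""
    scores = {}
    for c in priors:
        s = priors[c]
        for w in text.split():
            if w in vocab:                  # unseen words are ignored
                s += likelihoods[c][w]
        scores[c] = s
    m = max(scores.values())                # stabilise before exponentiating
    exp = {c: math.exp(s - m) for c, s in scores.items()}
    z = sum(exp.values())
    return {c: v / z for c, v in exp.items()}

priors, likelihoods, vocab = fit(train)
print(predict_proba("win free prize", priors, likelihoods, vocab))
```

Because the classifier returns a probability per class, the same code extends naturally to more than two labels, which is one reason Naive Bayes is a common baseline for multi-class text problems.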
Decision Trees can also serve as a research tool for learning about the data; for example, choosing which product to manufacture, other choice-based decisions, and similar research questions can be modelled with decision trees.
Naive Bayes, by contrast, rarely overfits the data, so no pruning is required; it is simpler than decision trees and often easier to interpret.
Decision Trees are typically used with large datasets, while Naive Bayes works well even on smaller data sets.