There are different ways to do this; I've outlined a few here:
a) DistCp is one of the best options for transferring HDFS data between clusters:
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.6/bk_Sys_Admin_Guides/content/using_distcp.html
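As a rough sketch, a basic DistCp run between two clusters might look like the following (the NameNode addresses and paths are placeholders you'd replace with your own):

```shell
# Copy a directory from the source cluster to the target cluster.
# hdfs://nn1:8020 and hdfs://nn2:8020 are placeholder NameNode addresses.
hadoop distcp hdfs://nn1:8020/source/path hdfs://nn2:8020/target/path

# For incremental re-runs: -update skips files already present with the
# same size/checksum, and -p preserves permissions and timestamps.
hadoop distcp -update -p hdfs://nn1:8020/source/path hdfs://nn2:8020/target/path
```

DistCp runs as a MapReduce job, so large copies are parallelized across the cluster rather than funneled through a single machine.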
b) For HBase there are two methods:
- Using DistCp for a full dump
- Using HBase snapshots
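For the snapshot method, a sketch of the commands involved (the table name, snapshot name, and target NameNode address are placeholders):

```shell
# Take a snapshot of the table via the HBase shell
echo "snapshot 'my_table', 'my_table_snapshot'" | hbase shell

# Export the snapshot to the target cluster's HDFS
hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot \
  -snapshot my_table_snapshot \
  -copy-to hdfs://nn2:8020/hbase \
  -mappers 16

# On the target cluster, recreate the table from the snapshot
echo "clone_snapshot 'my_table_snapshot', 'my_table'" | hbase shell
```

Snapshots are generally preferable to a raw DistCp of the HBase data directory because they give a consistent point-in-time view without taking the table offline.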
c) For the Hive Metastore, check which database backs it (e.g. MySQL) and take a full export of that database as shown below.
For a full dump, you can use the "root" user:
mysqldump -u [username] -p[password] [dbname] > filename.sql
And if you wish to compress it at the same time:
mysqldump -u [username] -p[password] [dbname] | gzip > filename.sql.gz
You can then move this file between servers with:
scp user@xxx.xxx.xxx.xxx:/path_to_your_dump/filename.sql.gz your_destination_path/
Once copied, import all objects into the MySQL database on the new server and start the Hive server.
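Putting the restore side together, a sketch of the final steps (the database name `metastore_db` is a placeholder, and the exact service commands vary by distribution):

```shell
# Decompress and load the dump into MySQL on the new server
gunzip < filename.sql.gz | mysql -u root -p metastore_db

# Then bring the Hive services back up
hive --service metastore &
hive --service hiveserver2 &
```

After the import, make sure hive-site.xml on the new cluster points at the restored MySQL database before starting the metastore.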