Try these steps (make necessary changes):
First, upload the dataset file to HDFS:
hdfs dfs -put custs --->ENTER
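A note on the file layout: the importtsv command below maps six comma-separated fields per line (row key, id, fname, lname, age, prof), so each record in custs should look roughly like the hypothetical line below; adjust the column mapping if your file differs.
4000001,4000001,Kristina,Chung,55,Pilot
You can confirm the upload with:
hdfs dfs -ls custs --->ENTER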
Next, create a new table in HBase.
Open the HBase shell:
hbase shell --->ENTER
Now create a table. You can give it your own name, but don't forget to use that same name in the HBase bulk load commands below:
create 'customerNew','info' --->ENTER
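Optionally, before quitting, you can verify that the table and its column family were created (standard HBase shell commands, shown here with the table name used above):
list --->ENTER
describe 'customerNew' --->ENTER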
quit --->ENTER
Now, execute the following command. Please make sure that you provide your own table name:
HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar /opt/cloudera/parcels/CDH/lib/hbase/hbase-server-1.2.0-cdh5.11.1.jar importtsv -Dimporttsv.separator=, -Dimporttsv.bulk.output=output -Dimporttsv.columns=HBASE_ROW_KEY,info:id,info:fname,info:lname,info:age,info:prof customerNew custs --->ENTER
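This step only generates HFiles in the 'output' directory in HDFS; nothing is loaded into the table yet. To check that the HFiles were written (assuming the same output path as above), you should see store files under a subdirectory named after the column family (info):
hdfs dfs -ls -R output --->ENTER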
Now, execute the following command, again with your own table name:
HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` hadoop jar /opt/cloudera/parcels/CDH/lib/hbase/hbase-server-1.2.0-cdh5.11.1.jar completebulkload output customerNew --->ENTER
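completebulkload moves the generated HFiles out of the output directory into the table's region directories, so after it finishes the output folder should no longer contain store files (a quick check, assuming the same path):
hdfs dfs -ls -R output --->ENTER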
Now, open the HBase shell again:
hbase shell --->ENTER
Finally, run the command below (with your table name) and you should see the data loaded into the table:
scan 'customerNew' --->ENTER
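If the scan output is too long to read, you can also check the row count or fetch a single record (the row key below is hypothetical; use one from your own data):
count 'customerNew' --->ENTER
get 'customerNew', '4000001' --->ENTER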