You need to create a new user ...READ MORE
The reason you are getting hadoop as ...READ MORE
I think you are using new version ...READ MORE
Here's where you can find the file: /etc/hadoop/[service ...READ MORE
Hey. It's definitely not a stupid question. ...READ MORE
Make sure you are running from the ...READ MORE
The Secondary namenode is mainly used as a ...READ MORE
You can create a file directly in ...READ MORE
Incremental append or load in sqoop will ...READ MORE
Seems like a Hive version problem. The insert operation is ...READ MORE
You have to add the partition before ...READ MORE
Fair Scheduling is the process in which ...READ MORE
No, the files after the reduce phase are ...READ MORE
Hey George! This error comes whenever you use ...READ MORE
The default port number for MySQL is ...READ MORE
Hey @Mohan! Could you try the following ...READ MORE
hadoop fs -cat /example2/doc1 | wc -l READ MORE
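The snippet above streams an HDFS file and pipes it to `wc -l` to count its lines. The same pipeline can be checked locally without a cluster (the sample text below is illustrative, not from the original answer):

```shell
# The answer's pattern: stream file contents and count newlines.
# With HDFS: hadoop fs -cat /example2/doc1 | wc -l
# The same pipeline, verified locally:
printf 'line1\nline2\nline3\n' | wc -l
```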
In HA (High Availability) architecture, we have ...READ MORE
There are many sites you can get ...READ MORE
Seems like your system does not have ...READ MORE
Try this:
stop all the daemons: ./stop-all.sh
format the namenode: cd ...READ MORE
You can use the FileUtil api to do this. Example: Configuration ...READ MORE
Never mind. I forgot to run hadoop namenode ...READ MORE
Try adding <property> <name>dfs.name.dir</name> <value>/path/to/hdfs/dir</value> ...READ MORE
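The property above appears to be a flattened fragment of hdfs-site.xml; a complete version of the fragment might look like this (the path value is the placeholder from the answer, and the newer key name in the comment is for reference):

```xml
<!-- hdfs-site.xml: directory where the NameNode stores its metadata.
     /path/to/hdfs/dir is a placeholder; in Hadoop 2+ the equivalent
     key is dfs.namenode.name.dir. -->
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>/path/to/hdfs/dir</value>
  </property>
</configuration>
```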
Logs are distributed across your cluster, but ...READ MORE
The error which you are getting can ...READ MORE
Seems like Firewall is blocking the connection. ...READ MORE
It can be controlled by setting the ...READ MORE
While running Scala, Scala objects are translated ...READ MORE
Run the command as sudo or add the ...READ MORE
First, format the namenode and then try ...READ MORE
There are two possible reasons for this: Wrong ...READ MORE
Below are the versions which can be used ...READ MORE
You have forgotten to include the package name ...READ MORE
You can see the free available space ...READ MORE
In hadoop, we do not create different ...READ MORE
You have to write this directory in ...READ MORE
Make the following changes to the hadoop-env.sh ...READ MORE
Make sure you have built Nutch from ...READ MORE
You can use the SPARK_MAJOR_VERSION for this. Suppose ...READ MORE
Hi. This is the code I used ...READ MORE
You can increase the threshold in yarn-site.xml <property> ...READ MORE
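For context, the yarn-site.xml property usually meant by "the threshold" here is the NodeManager disk-health-checker utilization limit; a sketch follows (the 98.5 value is illustrative, not from the original answer):

```xml
<!-- yarn-site.xml: raise the per-disk utilization percentage at which a
     NodeManager disk is marked unhealthy. 98.5 is an example value;
     the default is 90.0. -->
<property>
  <name>yarn.nodemanager.disk-health-checker.max-disk-utilization-per-disk-percentage</name>
  <value>98.5</value>
</property>
```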
1) Family Delete Marker - This marker marks all ...READ MORE
Yes, you can update the data before ...READ MORE
Type jps and check whether namenode and datanode are ...READ MORE
To rectify these errors, you need to ...READ MORE
You can use the DESCRIBE command to ...READ MORE
sudo service mysqld restart
mysql -u <username> root ...READ MORE
You can use the hdfs command: hdfs dfs ...READ MORE