Suppose you need to load this in ...READ MORE
I've gone through a process that is ...READ MORE
Hi, The number of map tasks for a ...READ MORE
Hey, You can load data from flat files ...READ MORE
FIELDS TERMINATED BY does not support multi-character delimiters. ...READ MORE
Try -copyFromLocal command READ MORE
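The -copyFromLocal usage typically looks like this (the file name and HDFS paths below are placeholders, not values from the original answer):

```shell
# Copy a file from the local filesystem into HDFS;
# the source is a local path, the destination is an HDFS path.
hadoop fs -copyFromLocal sample.txt /user/hadoop/sample.txt

# List the target directory to confirm the copy succeeded
hadoop fs -ls /user/hadoop
```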
Hello, If you want to see the content ...READ MORE
Failure of the resource manager is serious ...READ MORE
Suppose you want to kill the jobs ...READ MORE
Try removing the / before the input ...READ MORE
Hi, You can do one thing: Create namenode dir with ...READ MORE
Hey, Grunt shell is a shell command. The Grunts ...READ MORE
Hey, I guess there is no IP address ...READ MORE
This error is thrown when the parameters ...READ MORE
Hey, Create a znode with the given path. The flag argument ...READ MORE
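In the ZooKeeper CLI (zkCli.sh), creating a znode with the flag argument typically looks like this; the paths and data are illustrative:

```shell
# Create a plain persistent znode with some data
create /app_config "mydata"

# -e flag: ephemeral znode, deleted automatically when the client session ends
create -e /app_session "tmpdata"

# -s flag: sequential znode, ZooKeeper appends a monotonically increasing counter
create -s /app_lock "lockdata"
```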
Hi, In this case, it is searching ...READ MORE
Each reducer uses an OutputFormat to write ...READ MORE
With dev tools you can install directly ...READ MORE
We have to use Sqoop-HCatalog Integration here. ...READ MORE
Hi, You can follow the below-given solution. Just enter ...READ MORE
The default directory of Hadoop log file ...READ MORE
Hi, I want to write CCA-175 (CCA Spark ...READ MORE
In the datanodes, open the hdfs-site.xml file and add ...READ MORE
Hey, Yarn is configured with the "zero-configuration failover". So ...READ MORE
Yes, you can find out which database ...READ MORE
This seems like a path issue. Add the ...READ MORE
It is because the parent directories do ...READ MORE
Please try the below command: sqoop job --create ...READ MORE
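The general shape of a saved Sqoop job is sketched below; the job name, connection string, and table are placeholders, not values from the original answer:

```shell
# Define a reusable import job (note the space between "--" and "import")
sqoop job --create myjob \
  -- import \
  --connect jdbc:mysql://localhost/mydb \
  --table employees

# List saved jobs, then execute the one just created
sqoop job --list
sqoop job --exec myjob
```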
I am not sure if you want ...READ MORE
You can add this below property in oozie-site.xml:
<property>
  <name>oozie.service.HadoopAccessorService.jobTracker.whitelist</name>
  <value>myaddress:8020</value>
</property>
Hope ...READ MORE
If your dataset is in the FTP ...READ MORE
The easiest way is using the following ...READ MORE
This is my input file, output_pig_group_education_comma/input_load.txt, and its ...READ MORE
If we get the TimeOut Exception, then ...READ MORE
In NFS, the data is stored only ...READ MORE
I hope I understood your query properly. I ...READ MORE
Hey, You can also check this command: hadoop fs ...READ MORE
Follow these steps: STEP 1 : stop hadoop ...READ MORE
The two code samples use different APIs of Map ...READ MORE
You can use these commands. For namenode: ./hadoop-daemon.sh start ...READ MORE
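Assuming a standard Hadoop installation, the per-daemon start commands look like this:

```shell
# Start the NameNode and DataNode daemons individually
./hadoop-daemon.sh start namenode
./hadoop-daemon.sh start datanode

# jps lists running JVM processes, so the new daemons should appear here
jps
```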
It allows you to run dfs commands more ...READ MORE
You will have to start pig first ...READ MORE
Hey, The rerun option reruns a terminated (TIMEDOUT, SUCCEEDED, KILLED, ...READ MORE
The general syntax to do this as ...READ MORE
I want to replicate the task of the mapper, but ...READ MORE
Enter the below command in the terminal ...READ MORE
Hi, Since the table being dropped does not ...READ MORE
Hi, Namenode generates new namespaceID every time you ...READ MORE
Hey, You can do one thing, open the file ...READ MORE
You can use the following sample code for ...READ MORE