While running a mapreduce job in Hadoop, how to decide how many mappers are needed?
The number of mappers depends on the total size of the input, i.e. the total number of input splits of the input files (by default, one split per HDFS block).

Number of mappers = (total data size) / (input split size)

For example, if the data size is 1 TB and the input split size is 100 MB:

Number of mappers = 1,000,000 MB / 100 MB = 10,000
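The arithmetic above can be sketched as a small helper. This is only a rough estimate and the method name is illustrative; the actual count is decided at runtime by the job's InputFormat when it computes splits (tunable via properties such as `mapreduce.input.fileinputformat.split.minsize` and `mapreduce.input.fileinputformat.split.maxsize`):

```java
public class MapperCount {

    // Rough estimate of the number of map tasks:
    // one mapper per input split, rounding up for a partial final split.
    static long estimateMappers(long totalBytes, long splitBytes) {
        return (totalBytes + splitBytes - 1) / splitBytes;
    }

    public static void main(String[] args) {
        long totalBytes = 1000L * 1000 * 1000 * 1000; // 1 TB (decimal)
        long splitBytes = 100L * 1000 * 1000;         // 100 MB split size
        // Matches the worked example: 1,000,000 MB / 100 MB = 10,000 mappers
        System.out.println(estimateMappers(totalBytes, splitBytes));
    }
}
```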