You can do the following method, copy to ...READ MORE
This is happening because the file name ...READ MORE
LocalFS means it may be your LinuxFS ...READ MORE
It seems like you are missing a ...READ MORE
Write Ahead Log (WAL) is a file ...READ MORE
Please type the below command in the ...READ MORE
For SQOOP export please try below command: bin/sqoop ...READ MORE
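The snippet above truncates the suggested export command. As a rough, hedged sketch of what a typical Sqoop export invocation looks like (the JDBC URL, username, table name, and HDFS directory below are all placeholders, not values from the original answer):

```shell
# Hypothetical Sqoop export sketch -- every connection detail is a placeholder.
bin/sqoop export \
  --connect jdbc:mysql://localhost/mydb \
  --username dbuser \
  --table employees \
  --export-dir /user/hive/warehouse/employees
```

`--export-dir` points at the HDFS directory whose files are pushed into the target relational table.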
The syntax for Map-side join and Reduce-side ...READ MORE
The error you are getting is because ...READ MORE
The Cloudera Connect Partner Program, more than ...READ MORE
Partitioning: Hive has been one of the preferred ...READ MORE
If the downtime is not an issue, ...READ MORE
As per Cloudera, if you install hadoop ...READ MORE
Usually we have Map/Reduce pair written in ...READ MORE
The combiner class is not required in ...READ MORE
Let's understand full write mechanism so that ...READ MORE
impala-shell -i <domain_name>:<port> READ MORE
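Filling in the placeholders from the command above, a connection attempt might look like this (the hostname is invented; 21000 is the conventional impalad port, but verify it for your cluster):

```shell
# Connect impala-shell to a specific Impala daemon.
# Hostname is a placeholder; 21000 is the usual impalad default port.
impala-shell -i impala-host.example.com:21000
```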
Shared FS basically refers to the high ...READ MORE
Yes, You need to mention the below ...READ MORE
Try this: String cords = it.next().toString(); lattitude = Double.toString((inst.decode(cords))[0]); longitude ...READ MORE
It's simple. You just have to add external ...READ MORE
Seems like the content in core-site.xml file is ...READ MORE
You could pass the URI when getting ...READ MORE
In your code, you have set some ...READ MORE
The official location for Hadoop is the ...READ MORE
Here's another link from Hadoop which may ...READ MORE
Stop all running servers: 1) stop-all.sh Edit the ...READ MORE
The issue which you are facing is ...READ MORE
You can use a Writable, something like ...READ MORE
Here is an example of import command. ...READ MORE
Use org.apache.hive.jdbc.HiveDriver as your driver ...READ MORE
Step 1: Give the below command to ...READ MORE
You can provide security to Cloudera HDFS by using HDFS ...READ MORE
You cannot directly use files from ...READ MORE
For UBUNTU Hosts File and other configuration for Hadoop ...READ MORE
you need both core and SQL artifacts <repositories> ...READ MORE
Different relational operators are: foreach, order by, fil ...READ MORE
This is what happens: Map reduce framework will ...READ MORE
SELECT hash_id, COLLECT_LIST(num_of_cats) AS ...READ MORE
Yes, Hadoop 1.0 didn't have standby namenode. ...READ MORE
Check if bin/start-all.sh doesn't override JAVA_HOME; put echo ...READ MORE
I doubt if there is something which ...READ MORE
conf.set("key.value.separator.in.input.line", ","); Job job = new ...READ MORE
This seems like a problem with the ...READ MORE
You can run sqoop from inside your ...READ MORE
Unfortunately the command that you are giving ...READ MORE
No, Data-Locality concept applies to MAPPERS only. Reducers ...READ MORE
The total number of files in the ...READ MORE
Try this command hadoop dfs -put /var/tmp/students.txt / hadoop ...READ MORE
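The command in the snippet above uses the older `hadoop dfs` form, which is deprecated in current Hadoop releases in favour of `hadoop fs`. A sketch of the equivalent upload plus a listing to verify it (the file path is taken from the snippet; whether it exists on your machine is an assumption):

```shell
# Copy a local file into the HDFS root, then list the root to confirm.
# "hadoop fs" is the current form of the deprecated "hadoop dfs".
hadoop fs -put /var/tmp/students.txt /
hadoop fs -ls /
```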