1) start-all.sh – starts all the Hadoop daemons
2) hadoop namenode -format – formats the NameNode (run once, before the first start)
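A quick sketch of the very first start-up (assuming classic Hadoop 1.x, where the format command is hadoop namenode -format; on Hadoop 2.x and later it is hdfs namenode -format):

# Run once on the NameNode machine, before the first start.
# Formatting wipes the HDFS metadata, so never re-run it on a cluster that holds data.
hadoop namenode -format

# Then bring every daemon (HDFS + MapReduce) up in one step:
start-all.sh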
To start HDFS and MapReduce separately:
1) start-dfs.sh – starts the HDFS daemons, i.e. NameNode, DataNode and Secondary NameNode
2) start-mapred.sh – starts the JobTracker and TaskTracker
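A sketch of starting the two layers separately and then checking that the daemons came up (jps ships with the JDK; the daemon names assume Hadoop 1.x):

start-dfs.sh        # NameNode, DataNode, SecondaryNameNode
start-mapred.sh     # JobTracker, TaskTracker

jps                 # should list NameNode, DataNode, SecondaryNameNode,
                    # JobTracker and TaskTracker (plus Jps itself)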
Admin commands (usage examples follow the list):
dfsadmin
mradmin
fsck
balancer
distcp
fs
pipes
job
queue
jar
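A few hedged usage examples for the admin commands above (the paths, host names and threshold are illustrative placeholders, not values from these notes):

hadoop dfsadmin -report                   # capacity and live/dead DataNodes
hadoop dfsadmin -safemode get             # is the NameNode in safe mode?
hadoop fsck / -files -blocks              # health check of the whole namespace
hadoop balancer -threshold 10             # spread blocks evenly (10% threshold)
hadoop distcp hdfs://nn1:8020/src hdfs://nn2:8020/dest   # copy between clusters
hadoop job -list                          # running MapReduce jobs
hadoop queue -list                        # configured job queues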
Developer commands (a small end-to-end example follows the list):
1) hadoop fs
2) hadoop fs -ls /
3) hadoop fs -lsr
4) hadoop fs -mkdir /newfolder
5) hadoop fs -ls foldername
6) hadoop fs -put /etc/hostname /etc/hosts/kalian
7) jps
8) hadoop fs -cp /folder/host1 /folder/host2
9) put – local to HDFS
10) cp, mv – HDFS to HDFS
11) get – HDFS to local
12) rm – removes files from HDFS
13) hadoop fs -chown -R – changes the owner recursively (to give another user read permission, use hadoop fs -chmod instead)
14) hadoop fs -cat
15) hadoop fs -text
16) hadoop fs -du
17) hadoop fs -dus
18) hadoop fs -help
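Putting several of the fs commands above together, a small end-to-end sketch (the directory /demo, the file names and the user/group hadoopuser:hadoopgroup are placeholders for illustration):

hadoop fs -mkdir /demo                              # create a directory in HDFS
hadoop fs -put /etc/hostname /demo/                 # put: local to HDFS
hadoop fs -ls /demo                                 # list it
hadoop fs -cat /demo/hostname                       # print the file contents
hadoop fs -cp /demo/hostname /demo/hostname.bak     # cp: HDFS to HDFS
hadoop fs -get /demo/hostname /tmp/hostname.copy    # get: HDFS to local
hadoop fs -du /demo                                 # per-file sizes
hadoop fs -chmod -R o+r /demo                       # give other users read permission
hadoop fs -chown -R hadoopuser:hadoopgroup /demo    # change the owner recursively
hadoop fs -rm /demo/hostname.bak                    # rm: delete a file from HDFS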