Running hadoop, hive and mahout at the Stern Center for Research Computing
First, you must have your Stern userid enabled for hadoop. To do that, please send an email to email@example.com, or call the help desk at 212-998-0180 and have a ticket created for research computing.
To test your hadoop access, run:
hadoop fs -mkdir test
This should create a directory “test” in /user/yournetid (which is your default folder in the hadoop file system).
Next, run
hadoop fs -lsr
to get a recursive listing of all of your files in hadoop. (On newer hadoop releases -lsr is deprecated; hadoop fs -ls -R does the same thing.)
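If you want to check whether a path already exists in HDFS before creating it, hadoop fs -test returns an exit status you can branch on. A minimal sketch, reusing the “test” directory name from above:

```shell
# -d succeeds (exit 0) only if the path exists and is a directory in HDFS.
if hadoop fs -test -d test; then
    echo "test already exists in HDFS"
else
    hadoop fs -mkdir test
fi
```

The same idea works with -e (exists at all) or -f (is a regular file).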
Typing
hive
will enter the hive command line environment.
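You can also run hive statements without entering the interactive prompt, using its -e (single statement) and -f (script file) options. A minimal sketch; the table name and file paths here are made up for illustration:

```shell
# Run one HiveQL statement from the shell
# ("grades" is a hypothetical table name).
hive -e "SELECT * FROM grades LIMIT 10;"

# Run a whole file of HiveQL statements
# (the script path is hypothetical).
hive -f /mylocalpath/myquery.hql
```

This is handy for putting hive queries inside shell scripts or batch jobs.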
Typing
mahout
followed by a job name and its options will run a mahout job.
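As a sketch of how a mahout job is invoked (the input and output paths below are hypothetical, and which drivers are installed depends on the cluster):

```shell
# With no arguments, mahout prints the list of available job drivers.
mahout

# A job is launched as: mahout <driver> <options>. For example,
# seqdirectory converts a folder of text files in HDFS into
# Mahout sequence files (paths are placeholders, not real data).
mahout seqdirectory -i textinput -o seqoutput
```

The sequence-file output of a step like this is typically the input to a later mahout step, such as vectorization or clustering.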
Important things to remember.
hadoop keeps all of its files in its own file system called “hdfs”. You need to move your files from Linux into the hadoop file system with the
hadoop fs -put /mylocalpath/mylocalfile myhadoopfilename
command. That will copy the file at /mylocalpath/mylocalfile into HDFS under the name myhadoopfilename (relative to /user/yournetid).
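A round trip between Linux and HDFS can be sketched as follows; the file names are placeholders for illustration:

```shell
# A file on the Linux side (hypothetical path).
LOCAL=/tmp/demo.txt
# Destination name in HDFS, relative to /user/yournetid.
REMOTE=demo.txt

# Copy from Linux into HDFS, then print it back to confirm the transfer.
hadoop fs -put "$LOCAL" "$REMOTE"
hadoop fs -cat "$REMOTE"

# The reverse command copies a file out of HDFS back to Linux.
hadoop fs -get "$REMOTE" /tmp/demo-copy.txt
```

Remember that results written by hadoop, hive, or mahout jobs also live in HDFS, so you need hadoop fs -get (or -cat) to look at them from the Linux side.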