Cloud Computing for Data Analysis
Group Activity 04 - PART 2: Download the MapReduce Action Rules software and run it on Hadoop

Get the Car Evaluation data from:
http://webpages.uncc.edu/aatzache/ITCS6162/Project/Data/CarEvaluationData/CarData.zip

Get the Mammographic Mass data from:
http://webpages.uncc.edu/aatzache/ITCS6162/Project/Data/MammographicMassData/MammData.zip

Replicate both datasets 1024 times (a sketch of one way to do this follows the steps below). Templates of all data files are given in the 'Data' folder of the .zip file downloaded from the link below. Execute the code for both datasets.

Download the source code here:
http://webpages.uncc.edu/aatzache/ITCS6162/Project/StudentPrograms/MR_RandomForestExample_01.zip

To run the code:
1. Copy the attached ActionRules.jar file to the cluster using WinSCP or Cyberduck.
2. Templates of the attributes and parameters files to be given to the code are provided in attributes.txt and parameters.txt, respectively.
3. Create attributes.txt and parameters.txt following the given templates for the respective dataset, and copy them to DSBA-HDFS using:
   hadoop fs -put /users/YOURUNCC_USERNAME/file_name.txt /user/YOURUNCC_USERNAME/
4. Run the .jar file from your terminal or PuTTY using the following command (an example invocation is shown after these steps):
   hadoop jar path-to-jar-file snippet.Main path-to-attributes-file path-to-data-file path-to-parameters-file path-to-ActionRules-Output Minimum-Support Minimum-Confidence
5. Get the output using one of the following commands:
   hadoop fs -get HDFS_OutputPATH /users/abagavat/MRRandomForestOutput.txt       (if the output directory contains a single file)
   hadoop fs -getmerge HDFS_OutputPATH /users/abagavat/MRRandomForestOutput.txt  (if the output directory contains multiple files)
6. Upload MRRandomForestOutput.txt along with the text from the terminal window to Canvas.
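
Replicating the data: a minimal shell sketch, assuming "replicate 1024 times" means appending 1024 copies of the original rows into one plain-text file; the file names car.data, CarData_1024.txt, mamm.data, and MammData_1024.txt are hypothetical, so substitute the names of the files extracted from the .zip archives.

   # Append 1024 copies of the Car Evaluation rows into one file
   for i in $(seq 1 1024); do
       cat car.data >> CarData_1024.txt
   done

   # Repeat the same loop for the Mammographic Mass data
   for i in $(seq 1 1024); do
       cat mamm.data >> MammData_1024.txt
   done

Run the loops on the cluster (or locally) before uploading the replicated files to HDFS with hadoop fs -put.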
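
Example invocation for steps 3-5, using the Car Evaluation dataset. The username jdoe, the file names CarAttributes.txt, CarData_1024.txt, and CarParameters.txt, the output directory CarActionRulesOutput, and the minimum support 0.1 and minimum confidence 0.6 are all hypothetical placeholders; only the command structure and the snippet.Main class name come from the instructions above.

   # Step 3: copy the input files from the local home directory to HDFS
   hadoop fs -put /users/jdoe/CarAttributes.txt /user/jdoe/
   hadoop fs -put /users/jdoe/CarData_1024.txt /user/jdoe/
   hadoop fs -put /users/jdoe/CarParameters.txt /user/jdoe/

   # Step 4: run the job with example support/confidence thresholds
   hadoop jar /users/jdoe/ActionRules.jar snippet.Main /user/jdoe/CarAttributes.txt /user/jdoe/CarData_1024.txt /user/jdoe/CarParameters.txt /user/jdoe/CarActionRulesOutput 0.1 0.6

   # Step 5: merge the output directory into a single local file
   hadoop fs -getmerge /user/jdoe/CarActionRulesOutput /users/jdoe/MRRandomForestOutput.txt

Repeat the same sequence with the Mammographic Mass files, using a different HDFS output directory so the two runs do not overwrite each other.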