LING572 Hw10: NN and Backpropagation

Q1 (25 points): Run hw10.sh with different config file settings (i.e., changing the values in config.yml) and fill in Table 1. The activation value in the config file should be set to 0 (for the sigmoid function). Keep the learning rate at 0.5.

Table 1: Classification accuracy with the sigmoid activation function

Expt id | # of hidden layers | # of neurons in hidden layers | # of epochs | mini-batch size | test accuracy | CPU time (in minutes)
1       | 1                  | 30                            | 30          | 10              |               |
2       | 1                  | 30                            | 30          | 50              |               |
3       | 1                  | 30                            | 100         | 10              |               |
4       | 1                  | 60                            | 30          | 10              |               |
5       | 2                  | 30, 30                        | 30          | 10              |               |
6       | 2                  | 40, 20                        | 30          | 10              |               |
7       | 3                  | 20, 20, 20                    | 30          | 10              |               |

Q2 (50 points): Modify network.py and config.yml under that directory so that the new code will use tanh when the activation value in the config file is set to 1. Keep the learning rate at 0.5. (A minimal sketch of one possible tanh modification appears after the Submission section below.)

• Fill out Table 2, which is the same as Table 1, except that it uses tanh as the activation function.
• In the readme.[txt | pdf], explain which functions (or which lines) in which file(s) you have changed.
• Submit the modified Python code. Please keep the file names unchanged.

Table 2: Classification accuracy with the tanh activation function

Expt id | # of hidden layers | # of neurons in hidden layers | # of epochs | mini-batch size | test accuracy | CPU time (in minutes)
1       | 1                  | 30                            | 30          | 10              |               |
2       | 1                  | 30                            | 30          | 50              |               |
3       | 1                  | 30                            | 100         | 10              |               |
4       | 1                  | 60                            | 30          | 10              |               |
5       | 2                  | 30, 30                        | 30          | 10              |               |
6       | 2                  | 40, 20                        | 30          | 10              |               |
7       | 3                  | 20, 20, 20                    | 30          | 10              |               |

Submission: Submit the following to Canvas:

• Your note file readme.(txt | pdf) that includes Tables 1 and 2, and any notes that you want the TA to read.
• hw.tar.gz that includes all the files specified in /dropbox/18-19/572/hw10/submit-file-list, plus any source code (and binary code) used by the shell scripts.
• Make sure that you run check_hw10.sh before submitting your hw.tar.gz.
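Note on Q2: the sketch below shows one way the activation switch could be organized. It is only an illustration, not the required implementation; the function names (sigmoid, sigmoid_prime, select_activation) and the parameter activation_id are assumptions, and the actual network.py may structure its activation code differently. The facts it relies on are that tanh'(z) = 1 - tanh(z)^2 and that feedforward and backpropagation must use the derivative matching whichever activation the config selects.

    import numpy as np

    # Hypothetical helpers; the real network.py may already define the
    # sigmoid pair under other names or as methods of a Network class.

    def sigmoid(z):
        """Logistic activation, used when the config's activation value is 0."""
        return 1.0 / (1.0 + np.exp(-z))

    def sigmoid_prime(z):
        """Derivative of the logistic function: sigma(z) * (1 - sigma(z))."""
        s = sigmoid(z)
        return s * (1.0 - s)

    def tanh(z):
        """Hyperbolic tangent activation, used when the activation value is 1."""
        return np.tanh(z)

    def tanh_prime(z):
        """Derivative of tanh: 1 - tanh(z)**2."""
        return 1.0 - np.tanh(z) ** 2

    def select_activation(activation_id):
        """Map the activation value from config.yml to an (f, f_prime) pair."""
        if activation_id == 1:
            return tanh, tanh_prime
        return sigmoid, sigmoid_prime

If the baseline code hard-codes calls to the sigmoid functions inside its feedforward and backprop routines, one clean change is to store the pair returned by a selector like the one above and call it wherever the activation or its derivative is needed. Keep in mind that tanh outputs lie in [-1, 1] rather than [0, 1], so depending on how the output layer and cost are defined you may need to handle the output layer separately; whatever you decide, describe it in the readme.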