CS–Assignment #4 Solution


Note 1: Your submission header must have the format as shown in the above-enclosed rounded rectangle.

Note 2: Homework is to be done individually. You may discuss the homework problems with your fellow students, but you are NOT allowed to copy – either in part or in whole – anyone else’s answers.

Note 3: Your deliverable should be a .pdf file submitted through Gradescope until the deadline. Do not forget to assign a page to each of your answers when making a submission. In addition, source code (.py files) should be added to an online repository (e.g., github) to be downloaded and executed later.

Note 4: All submitted materials must be legible. Figures/diagrams must have good quality.

Note 5: Please use and check the Canvas discussion for further instructions, questions, answers, and hints.

1. [20 points] Train the perceptron networks below to solve the following logical problems:

  a. [10 points] logical AND problem, correctly classifying the dataset instances. Use the parameters: learning rate η = 0.4, initial weights = 1, and activation function = Heaviside. (A sketch of the update rule appears after the solution-format note below.)

[Figure: single-layer perceptron with inputs x1 and x2 and a bias input fixed at 1]

Dataset (logical AND):

x1 | x2 | y
 0 |  0 | 0
 0 |  1 | 0
 1 |  0 | 0
 1 |  1 | 1

  b. [10 points] logical NOT problem, correctly classifying the dataset instances. Use the parameters: learning rate η = 0.1, initial weights = 0, and activation function = Heaviside.

[Figure: single-layer perceptron with input x and a bias input fixed at 1]

Dataset (logical NOT):

x | y
0 | 1
1 | 0

Solution format: Include the solution table (as illustrated in the lecture) with all variables and values calculated for each iteration. Hint: you can use an Excel spreadsheet to solve this problem and add your solution table here.
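The following is a minimal sketch of the perceptron update rule with a Heaviside activation, intended only as a cross-check for the hand/spreadsheet computation; the function names, the epoch limit, and the stopping condition are assumptions, not part of the assignment.

```python
import numpy as np

def heaviside(z):
    # Step activation: 1 if the net input is >= 0, else 0
    return 1 if z >= 0 else 0

def train_perceptron(X, y, eta, w_init, max_epochs=20):
    """Train a single perceptron with a bias input fixed at 1.

    X: list of input vectors, y: target outputs,
    eta: learning rate, w_init: initial value for every weight (incl. bias).
    """
    n_inputs = len(X[0])
    w = np.full(n_inputs + 1, float(w_init))       # [bias, w1, ..., wn]
    for epoch in range(max_epochs):
        errors = 0
        for x, target in zip(X, y):
            x_b = np.concatenate(([1.0], x))       # prepend the bias input
            out = heaviside(np.dot(w, x_b))
            delta = target - out
            w += eta * delta * x_b                 # perceptron learning rule
            errors += int(delta != 0)
        if errors == 0:                            # every instance classified correctly
            break
    return w

# Logical AND (problem 1a): eta = 0.4, initial weights = 1
X_and = [[0, 0], [0, 1], [1, 0], [1, 1]]
y_and = [0, 0, 0, 1]
print(train_perceptron(X_and, y_and, eta=0.4, w_init=1))

# Logical NOT (problem 1b): eta = 0.1, initial weights = 0
X_not = [[0], [1]]
y_not = [1, 0]
print(train_perceptron(X_not, y_not, eta=0.1, w_init=0))
```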

2. [15 points] Complete the Python program (perceptron.py) that reads the file optdigits.tra to build Single-Layer Perceptron and Multi-Layer Perceptron classifiers. You will simulate a grid search to find which combination of two hyperparameters (learning rate and shuffle) gives the best prediction performance for each classifier. To test the accuracy of these models, you will use the file optdigits.tes. You should update and print the accuracy of each classifier, together with the corresponding hyperparameters, whenever it improves.
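A possible skeleton for the grid search is sketched below, assuming scikit-learn's Perceptron and MLPClassifier and that optdigits.tra/optdigits.tes are comma-separated with the class label in the last column (an assumption based on the UCI optdigits format); the hyperparameter values to try are placeholders.

```python
import numpy as np
from sklearn.linear_model import Perceptron
from sklearn.neural_network import MLPClassifier

# Assumed file layout: 64 comma-separated features followed by the class label
train = np.loadtxt("optdigits.tra", delimiter=",")
test = np.loadtxt("optdigits.tes", delimiter=",")
X_train, y_train = train[:, :-1], train[:, -1]
X_test, y_test = test[:, :-1], test[:, -1]

best = {"Perceptron": 0.0, "MLP": 0.0}
for eta in [0.0001, 0.001, 0.01, 0.1, 1.0]:        # learning rates to try (example values)
    for shuffle in [True, False]:                   # shuffle hyperparameter
        models = {
            "Perceptron": Perceptron(eta0=eta, shuffle=shuffle),
            "MLP": MLPClassifier(learning_rate_init=eta, shuffle=shuffle, max_iter=300),
        }
        for name, model in models.items():
            model.fit(X_train, y_train)
            acc = model.score(X_test, y_test)
            if acc > best[name]:                    # print only when the accuracy improves
                best[name] = acc
                print(f"{name}: accuracy={acc:.4f} (eta={eta}, shuffle={shuffle})")
```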

3. [20 points] The figure below is a network of linear neurons, that is, the output of each neuron is identical to its input (linear activation function). The numbers at the connections indicate the weights of the links. (A short linearity sketch is given after the sub-questions.)

    a. [10 points] Find the value at the output nodes of the network for the given input (0.5, 1).

[Figure: network of linear neurons; the weights are shown on the connections and the node values to be computed are marked with "?"]

    b. [5 points] Find the value at the output nodes of the network for the input (1, 2).

    c. [5 points] Do you need to repeat the computations all over again? Why?
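Because every neuron is linear, the whole network collapses into a matrix product, so scaling the input scales the output by the same factor. The sketch below illustrates this with hypothetical weight matrices; they are placeholders, not the weights shown in the assignment figure.

```python
import numpy as np

# Hypothetical weights, NOT the ones in the assignment figure:
# W1 maps the 2 inputs to a hidden layer, W2 maps the hidden layer to the 2 outputs.
W1 = np.array([[1.0, -0.5, 2.0],
               [0.5,  1.0, 1.0]])   # shape (2 inputs, 3 hidden)
W2 = np.array([[1.0,  0.5],
               [2.0, -1.0],
               [0.5,  1.0]])        # shape (3 hidden, 2 outputs)

def forward(x):
    # Linear activation: each neuron outputs exactly its weighted-sum input
    return x @ W1 @ W2

out_a = forward(np.array([0.5, 1.0]))
out_b = forward(np.array([1.0, 2.0]))
print(out_a, out_b)
print(np.allclose(out_b, 2 * out_a))  # True: (1, 2) = 2 * (0.5, 1), so the output simply doubles
```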

4. [15 points] Deep Learning. Complete the Python program (deep_learning.py) that will learn how to classify fashion items. You will use the Fashion MNIST dataset, which includes 70,000 grayscale images of 28×28 pixels each, with 10 classes, each class representing a fashion item as illustrated below. You will use Keras to load the dataset, which includes 60,000 images for training and 10,000 for testing. Your goal is to train and test multiple deep neural networks and check their corresponding performances, always updating the highest accuracy found. This time you will use a separate function named build_model() to define the architectures of your neural networks. Finally, the weights of the best model will be printed, together with the architecture and the learning curves. To install TensorFlow use: python -m pip install --upgrade tensorflow.
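A minimal sketch of what deep_learning.py might look like is shown below, assuming Keras's built-in fashion_mnist loader; the layer sizes, epoch count, and the set of architectures to compare are placeholders to be replaced by your own search.

```python
from tensorflow import keras

# Fashion MNIST: 60,000 training and 10,000 test grayscale 28x28 images, 10 classes
(X_train, y_train), (X_test, y_test) = keras.datasets.fashion_mnist.load_data()
X_train, X_test = X_train / 255.0, X_test / 255.0   # scale pixel values to [0, 1]

def build_model(hidden_units=128, learning_rate=0.001):
    """Return a compiled dense network; this architecture is only an example."""
    model = keras.Sequential([
        keras.layers.Input(shape=(28, 28)),
        keras.layers.Flatten(),
        keras.layers.Dense(hidden_units, activation="relu"),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=learning_rate),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

best_acc, best_model, best_history = 0.0, None, None
for units in [64, 128]:                    # architectures to compare (example values)
    model = build_model(hidden_units=units)
    history = model.fit(X_train, y_train, epochs=5, validation_split=0.1, verbose=0)
    _, acc = model.evaluate(X_test, y_test, verbose=0)
    if acc > best_acc:                     # keep track of the highest accuracy found
        best_acc, best_model, best_history = acc, model, history
        print(f"new best accuracy: {acc:.4f} with {units} hidden units")

print(best_model.get_weights())            # weights of the best model
best_model.summary()                       # its architecture
print(best_history.history)                # data for the learning curves
```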

5. [15 points] Consider the dataset below.

Outlook   | Temperature | PlayTennis
Sunny     | Hot         | No
Overcast  | Cool        | Yes
Overcast  | Hot         | Yes
Rain      | Cool        | No
Overcast  | Mild        | Yes

We will apply a steady-state Genetic Algorithm (r = 0.5, i.e., half of the population is replaced each generation) to solve this classification problem, using an approach similar to the Find-S algorithm. If an instance matches the rule pre-condition of the chromosome, you predict according to its post-condition; otherwise you predict the opposite class defined by the chromosome (so there is a prediction for every instance). Follow the planning below to build your solution; a short sketch of the fitness evaluation is given after the planning list.

  • Representation: a single if-then rule using bit strings (binary encoding).

      o Outlook <Sunny, Overcast, Rain>; Temperature <Hot, Mild, Cool>

      o Examples:

        Outlook = Overcast → 010

        Outlook = Overcast ∨ Rain → 011

        Outlook = Sunny ∨ Overcast ∨ Rain → 111

        Outlook = (Overcast ∨ Rain) ∧ (Temperature = Hot) → 011100

        If Outlook = Sunny ∧ Temperature = Hot then PlayTennis = Yes → 1001001

  • Initial population (Chromosomes) : (C1=1001001, C2=0100101, C3=1011000, C4=1101100). Population size should remain the same (4 individuals) over time.

  • Fitness function: accuracy

  • Penalty criterion: no penalty.

  • Selection method: The best two chromosomes are carried over to the next generation. The other two are selected for crossover by using the roulette wheel (simulation).

  • Crossover strategy: single-point crossover with mask 1110000 in the 1st generation, two-point crossover with mask 0001100 in the 2nd generation. Use the following chromosomes to perform crossover (simulating the process of spinning the roulette wheel according to the relative fitness, i.e., the sectors of the roulette wheel), generating two offspring for each crossover:

      o (1st and 3rd) in the 1st generation

      o (1st and 2nd) in the 2nd generation

    • Mutation: on the 6th bit of the chromosome(s) 1011000 generated during the 2nd generation.

    • Termination criteria: accuracy = 1.0. Return the corresponding chromosome(s) – your model.
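As referenced above, the following is a minimal sketch, under the stated 7-bit encoding, of how a chromosome's accuracy (fitness) could be computed over the five instances; the helper names and data structures are mine, not part of the assignment.

```python
# Dataset from the table above: (Outlook, Temperature, PlayTennis)
DATA = [
    ("Sunny",    "Hot",  "No"),
    ("Overcast", "Cool", "Yes"),
    ("Overcast", "Hot",  "Yes"),
    ("Rain",     "Cool", "No"),
    ("Overcast", "Mild", "Yes"),
]
OUTLOOK = {"Sunny": 0, "Overcast": 1, "Rain": 2}      # bit positions 0-2
TEMPERATURE = {"Hot": 3, "Mild": 4, "Cool": 5}        # bit positions 3-5
# Bit 6 is the post-condition: 1 -> PlayTennis = Yes, 0 -> PlayTennis = No

def fitness(chromosome):
    """Accuracy of a 7-bit rule chromosome such as '1001001' on DATA."""
    bits = [int(b) for b in chromosome]
    post = "Yes" if bits[6] == 1 else "No"
    correct = 0
    for outlook, temp, label in DATA:
        # The instance matches the pre-condition if the bits for both of its
        # attribute values are set in the chromosome
        matches = bits[OUTLOOK[outlook]] == 1 and bits[TEMPERATURE[temp]] == 1
        # Predict the post-condition on a match, the opposite class otherwise
        prediction = post if matches else ("No" if post == "Yes" else "Yes")
        correct += int(prediction == label)
    return correct / len(DATA)

for c in ["1001001", "0100101", "1011000", "1101100"]:   # initial population
    print(c, fitness(c))
```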

Solution format:

Fitness(C?) = ?

Fitness(C?) = ?

Fitness(C?) = ?

Fitness(C?) = ?

1st generation (C?, C?, C?, C?):

Pr(C?) = ? (4th)

Pr(C?) = ? (3rd)

Pr(C?) = ? (2nd)

Pr(C?) = ? (1st)

C? = ??????? → C? = ???????

C? = ??????? → C? = ???????

Fitness(C?) = ?

Fitness(C?) = ?

Fitness(C?) = ?

Fitness(C?) = ?

2nd generation (?, ?, ?, ?)

Pr(C?) = ? (4th)

Pr(C?) = ? (3rd)

Pr(C?) = ? (2nd)

Pr(C?) = ? (1st)

C? = ??????? → C? = ???????

C? = ??????? → C? = ???????

Applying mutation on 1011000

Fitness(C?) = ?

Fitness(C?) = ?

Fitness(C?) = ?

Fitness(C?) = ?

Final answer:

6. [15 points] Consider the combinatorial optimization problem known as the "0-1 Knapsack problem", where we need to fill a knapsack with objects of different weights and values. The goal is to fill the knapsack with the highest possible value, without exceeding its maximum capacity (weight supported) and carrying at most one of each object. Below is the list of available objects, with their respective weights and values.

Maximum weight capacity (C) = 15 kg.

Object    | Weight (w) | Value (v)
Tablet    | 5 kg       | $570.00
Laptop    | 8 kg       | $710.00
Projector | 10 kg      | $640.00

Apply a generational Genetic Algorithm (r = 1, i.e., the entire population is replaced each generation) to solve this optimization problem, strictly following the planning below. (A short fitness/penalty sketch is given after the summary of the optimization function.)

  • Representation: binary, identifying whether the object should be included or not, with the 1st, 2nd, and 3rd positions of the chromosomes referring to the Tablet, Laptop, and Projector respectively.

  • Initial population (Chromosomes) : (C1=000, C2=001, C3=010, C4=100). Population size should remain the same (4 individuals) over time.

  • Fitness function: f(x) = ∑_{i=1}^{n} v_i·x_i, with n being the number of available objects and x_i being the number of instances of object i to include in the knapsack (1 if the object is included, 0 otherwise).

  • Constraint: ∑_{i=1}^{n} w_i·x_i ≤ C.

  • Penalty criterion: If the maximum capacity of the knapsack (C) is exceeded, discard the obtained chromosome (offspring) and replicate the best among the two of its parents.

  • Selection method: roulette wheel (simulation)

  • Crossover strategy: single-point crossover with mask 110 in the 1st generation, single-point crossover with mask 100 in the 2nd generation. Use the following chromosomes to perform crossover (simulating the process of spinning the roulette wheel) according to the relative fitness (sectors of a roulette wheel), generating two offspring for each crossover:

      • (2nd and 3rd) and (2nd and 1st) in the 1st generation

      • (1st and 1st) and (1st and 2nd) in the 2nd generation

    • Mutation: on the 3rd bit of the chromosome(s) 101 generated during the 2nd generation.

    • Termination criteria: stop at the 3rd generation. Return the best chromosome found until then.

Summary of the optimization function:

max ∑_{i=1}^{n} v_i·x_i

subject to ∑_{i=1}^{n} w_i·x_i ≤ C, with x_i ∈ {0,1}
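As referenced above, this is a minimal sketch of the fitness and capacity check under the stated encoding (3-bit chromosomes ordered Tablet, Laptop, Projector); the function names are placeholders, not part of the required program.

```python
WEIGHTS = [5, 8, 10]          # kg: Tablet, Laptop, Projector
VALUES = [570, 710, 640]      # $:  Tablet, Laptop, Projector
CAPACITY = 15                 # kg (maximum weight capacity C)

def total_weight(chromosome):
    # Sum of the weights of the included objects
    return sum(w for w, bit in zip(WEIGHTS, chromosome) if bit == "1")

def fitness(chromosome):
    """Sum of the values of the included objects (the knapsack objective)."""
    return sum(v for v, bit in zip(VALUES, chromosome) if bit == "1")

def apply_penalty(offspring, parent_a, parent_b):
    """Penalty criterion: if the offspring exceeds C, discard it and
    replicate the best (highest-fitness) of its two parents instead."""
    if total_weight(offspring) > CAPACITY:
        return max(parent_a, parent_b, key=fitness)
    return offspring

for c in ["000", "001", "010", "100"]:   # initial population
    print(c, fitness(c), total_weight(c))
```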

Solution format: the same as question 5.

Important Note: Answers to all questions should be written clearly and concisely, and each answer must be unmistakably delineated. You may resubmit multiple times until the deadline (the last submission will be considered).

NO LATE ASSIGNMENTS WILL BE ACCEPTED. ALWAYS SUBMIT WHATEVER YOU HAVE COMPLETED FOR PARTIAL CREDIT BEFORE THE DEADLINE!
