2023-Q3-AI 4. Regression, Classification - PyTorch

 

4.1. Video / Materials

Video (19 Jul 2023, 13:00):

https://youtube.com/live/EvNawwH_mYU?feature=share

Jamboard: https://jamboard.google.com/d/1jEqwMfh9FZBYE_H2gYX-BUyjqMOlCl_54p6hk-5KVo4/edit?usp=sharing

Materials: https://pytorch.org/tutorials/beginner/nlp/deep_learning_tutorial.html

 


 

For YouTube Live, using OBS streaming software

YouTube key: 6z53-3r17-a8tu-u50j-1xgq (different key) RTMP: rtmp://a.rtmp.youtube.com/live2


 

 

Previous year:

Video: https://youtu.be/Ejv260iiVfo

Jamboard: https://jamboard.google.com/d/1iiLPYbVyYwc9RM35fSTUWXH0gvh6POnZ73hveOeT7mk/edit?usp=sharing

 

 


 

4.2. Implement a PyTorch version for the regression task

Use the reference video 6.1 and the template. Implement the following (a rough sketch is given after the template link below):

  1. Create an Embedding Matrix for categorical input data.

  2. Implement the MSE error function.

  3. Implement Huber loss as an error function http://share.yellowrobot.xyz/upic/753bc664e86ca87b3c714d4215254851_1676800801.jpg

Submit a screenshot with results and source code!

Template:

http://share.yellowrobot.xyz/quick/2023-2-19-7A72ACC5-39F4-44EB-8223-374D2FF9ED8D.zip
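Below is a minimal sketch (not the official solution) of the pieces asked for above, assuming the pseudo-Huber form given in the formula section at the end of this document; all tensor shapes, names, and the standalone use of torch.nn.Embedding are illustrative assumptions rather than the template's actual structure:

```python
import torch

# Illustrative embedding matrix for one categorical feature
# (assumed: 10 categories mapped to 4-dimensional vectors).
category_embedding = torch.nn.Embedding(num_embeddings=10, embedding_dim=4)

def loss_mse(y_prim, y):
    # Mean squared error averaged over all samples.
    return torch.mean((y_prim - y) ** 2)

def loss_huber(y_prim, y, delta=1.0):
    # Pseudo-Huber loss: delta^2 * (sqrt(1 + ((y - y_prim)/delta)^2) - 1), averaged over samples.
    return torch.mean(delta ** 2 * (torch.sqrt(1.0 + ((y - y_prim) / delta) ** 2) - 1.0))

# Quick usage check with random data (shapes are assumptions).
cat_idx = torch.randint(0, 10, (8,))   # batch of 8 categorical indices
emb = category_embedding(cat_idx)      # -> (8, 4) embedded vectors
y = torch.randn(8, 1)
y_prim = torch.randn(8, 1)
print(loss_mse(y_prim, y).item(), loss_huber(y_prim, y).item())
```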


4.3. Implement a PyTorch version for the classification task

Tasks (a rough sketch is given after the template link below):

  1. Implement Cross-Entropy Loss (must not use the built-in loss function) - http://share.yellowrobot.xyz/upic/c83a2498563001a2e30acb1f40c97aa6_1676800812.jpg

  2. Implement the Accuracy metric and chart - http://share.yellowrobot.xyz/upic/fcec10b94af5e36ab374e8d29e6811d6_1676800827.jpg

  3. Implement the Confusion matrix (https://towardsdatascience.com/confusion-matrix-for-your-multi-class-machine-learning-model-ff9aa3bf7826)

  4. Implement the F1-score metric - http://share.yellowrobot.xyz/upic/cb03b2e54de0e41873f94cf6ab81ed72_1676800842.jpg

Submit a screenshot with results and source code!

Template:

http://share.yellowrobot.xyz/quick/2023-2-19-13E1F9A9-F0AF-4036-A1C0-DE62F27E0FF3.zip
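A rough sketch of how the four pieces above could be implemented without built-in loss functions; it assumes one-hot targets y and softmax outputs y_prim, and the function names and macro-averaged F1 are illustrative choices, not requirements taken from the template:

```python
import torch

def loss_cce(y_prim, y, epsilon=1e-8):
    # Categorical cross-entropy with one-hot targets (no built-in loss used).
    return -torch.mean(torch.sum(y * torch.log(y_prim + epsilon), dim=-1))

def accuracy(y_prim, y):
    # Fraction of samples where the predicted class matches the true class.
    return (y_prim.argmax(dim=-1) == y.argmax(dim=-1)).float().mean()

def confusion_matrix(y_prim, y, num_classes):
    # Rows: true class, columns: predicted class.
    conf = torch.zeros((num_classes, num_classes), dtype=torch.long)
    for t, p in zip(y.argmax(dim=-1), y_prim.argmax(dim=-1)):
        conf[t, p] += 1
    return conf

def f1_macro(conf):
    # Macro-averaged F1 computed per class from the confusion matrix.
    tp = conf.diag().float()
    fp = conf.sum(dim=0).float() - tp
    fn = conf.sum(dim=1).float() - tp
    return torch.mean(2 * tp / (2 * tp + fp + fn + 1e-8))

# Tiny usage example with 4 samples and 3 classes (values are arbitrary).
y = torch.eye(3)[torch.tensor([0, 2, 1, 1])]
y_prim = torch.softmax(torch.randn(4, 3), dim=-1)
conf = confusion_matrix(y_prim, y, num_classes=3)
print(loss_cce(y_prim, y).item(), accuracy(y_prim, y).item(), f1_macro(conf).item())
print(conf)
```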


4.4. Implement a PyTorch version for the binary classification task

Tasks (a rough sketch is given after the template link below):

  1. Implement Binary-Cross-Entropy Loss (must not use the built-in loss function) http://share.yellowrobot.xyz/upic/8c77d1b6f1d5e3c44febd63f0fecb9fd_1676800928.jpg

  2. Implement the BatchNormalization function in PyTorch http://share.yellowrobot.xyz/upic/8b4284380b4fc8c31ce424ba10f9da4f_1676800938.jpg

Submit a screenshot with results and source code!

Template:

http://share.yellowrobot.xyz/quick/2023-2-19-9B05C46C-8CB9-4228-A190-C77F6218B051.zip
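A hedged sketch of a hand-written binary cross-entropy and a minimal batch-normalization module; the module only uses batch statistics (no running averages for evaluation mode), which is a simplifying assumption rather than something the template requires:

```python
import torch

def loss_bce(y_prim, y, epsilon=1e-8):
    # Binary cross-entropy: targets y in {0, 1}, predictions y_prim in (0, 1).
    return -torch.mean(y * torch.log(y_prim + epsilon) +
                       (1.0 - y) * torch.log(1.0 - y_prim + epsilon))

class BatchNorm1dManual(torch.nn.Module):
    # Simplified batch normalization over the batch dimension.
    def __init__(self, num_features, epsilon=1e-5):
        super().__init__()
        self.gamma = torch.nn.Parameter(torch.ones(num_features))
        self.beta = torch.nn.Parameter(torch.zeros(num_features))
        self.epsilon = epsilon

    def forward(self, x):
        mean = x.mean(dim=0, keepdim=True)
        var = x.var(dim=0, unbiased=False, keepdim=True)
        x_hat = (x - mean) / torch.sqrt(var + self.epsilon)
        return self.gamma * x_hat + self.beta

# Usage example (shapes and values are arbitrary).
bn = BatchNorm1dManual(num_features=4)
x = torch.randn(16, 4)
y = torch.randint(0, 2, (16, 1)).float()
y_prim = torch.sigmoid(torch.randn(16, 1))
print(loss_bce(y_prim, y).item(), bn(x).shape)
```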


4.5. Homework - Implement a numpy version for the classification task

Tasks (a rough sketch is given after the template link below):

  1. Implement the SoftMax function and its derivative http://share.yellowrobot.xyz/upic/7cea763fedc5123ae886830fe3622c82_1676801194.jpg

  2. Implement the CCE loss function and its derivative - http://share.yellowrobot.xyz/upic/7389daf82d3fee898d5e87e68d98cb70_1676800994.jpg

Submit a screenshot with results and source code!

Template:

http://share.yellowrobot.xyz/quick/2023-2-19-0BCA422E-0120-4165-89DD-A4E8C2DBB024.zip
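A rough NumPy sketch of the softmax and CCE forward passes together with their derivatives, matching the formulas reproduced at the end of this document; the explicit per-sample Jacobian helper and all names are illustrative assumptions, not the graded template's structure:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    exp = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return exp / np.sum(exp, axis=-1, keepdims=True)

def softmax_jacobian(a):
    # Jacobian of softmax for one sample's output vector a: diag(a) - a a^T.
    return np.diag(a) - np.outer(a, a)

def loss_cce(y, y_prim, epsilon=1e-8):
    # Categorical cross-entropy with one-hot targets y and softmax outputs y_prim.
    return -np.mean(np.sum(y * np.log(y_prim + epsilon), axis=-1))

def d_loss_cce(y, y_prim, epsilon=1e-8):
    # Derivative of the CCE loss w.r.t. the softmax outputs y_prim.
    return -y / (y_prim + epsilon) / y.shape[0]

# Usage example with one sample and 5 classes (values are arbitrary).
x = np.array([[0.2, -1.0, 0.5, 0.1, 0.0]])
y = np.array([[0.0, 0.0, 1.0, 0.0, 0.0]])
y_prim = softmax(x)
print(loss_cce(y, y_prim))
print(d_loss_cce(y, y_prim) @ softmax_jacobian(y_prim[0]))  # gradient w.r.t. the logits x
```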

 


 

 


 

Accuracy formula

$$acc = \frac{1}{N} \sum \frac{\text{correct}}{\text{all}}$$

Huber loss

$$L_{huber}(y, y') = \frac{1}{N} \sum_{n} \delta^2 \left( \sqrt{1 + \left( \frac{y_n - y'_n}{\delta} \right)^2} - 1 \right)$$

 

CCE formula

$$L_{CCE}(y, y') = -\frac{1}{N} \sum_{n} w_n \, y_n \log\left(y'_n + \epsilon\right)$$

 

F1 formula

$$F_1 = \frac{1}{N} \sum \frac{2 \cdot TP}{2 \cdot TP + FP + FN}$$

 

BCE formula

$$L_{BCE}(y, y') = -\frac{1}{N} \sum_{n} w_n \left( y_n \log\left(y'_n + \epsilon\right) + \left(1 - y_n\right) \log\left(1 - y'_n + \epsilon\right) \right)$$

 

BatchNormalization formula

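The batch normalization formula referenced above was an image in the original notes and is not reproduced here; for reference, the standard transform (as commonly defined in the literature) is:

$$\mu_B = \frac{1}{m}\sum_{i=1}^{m} x_i \qquad \sigma_B^2 = \frac{1}{m}\sum_{i=1}^{m} \left(x_i - \mu_B\right)^2 \qquad \hat{x}_i = \frac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}} \qquad y_i = \gamma\,\hat{x}_i + \beta$$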

 

Softmax formula derivative

$$\text{SoftMax}(x)_i = \frac{e^{x_i}}{\sum_{k=1}^{K} e^{x_k}}, \qquad a = \text{SoftMax}(x) = \begin{bmatrix} a_0 & a_1 & a_2 & a_3 & a_4 \end{bmatrix}$$

$$\frac{\partial\, \text{SoftMax}(x)}{\partial x} = \begin{bmatrix}
a_0(1-a_0) & -a_0 a_1 & -a_0 a_2 & -a_0 a_3 & -a_0 a_4 \\
-a_1 a_0 & a_1(1-a_1) & -a_1 a_2 & -a_1 a_3 & -a_1 a_4 \\
-a_2 a_0 & -a_2 a_1 & a_2(1-a_2) & -a_2 a_3 & -a_2 a_4 \\
-a_3 a_0 & -a_3 a_1 & -a_3 a_2 & a_3(1-a_3) & -a_3 a_4 \\
-a_4 a_0 & -a_4 a_1 & -a_4 a_2 & -a_4 a_3 & a_4(1-a_4)
\end{bmatrix}$$
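As an illustrative sanity check (not part of the original material), the Jacobian above can be compared against numerical differentiation; the input values below are arbitrary:

```python
import numpy as np

def softmax(x):
    exp = np.exp(x - np.max(x))
    return exp / np.sum(exp)

x = np.array([0.3, -0.5, 1.2, 0.0, 0.7])
a = softmax(x)

# Analytic Jacobian from the formula above: diag(a) - a a^T.
jac_analytic = np.diag(a) - np.outer(a, a)

# Numerical Jacobian via central differences.
eps = 1e-6
jac_numeric = np.zeros((5, 5))
for k in range(5):
    dx = np.zeros(5)
    dx[k] = eps
    jac_numeric[:, k] = (softmax(x + dx) - softmax(x - dx)) / (2 * eps)

print(np.max(np.abs(jac_analytic - jac_numeric)))  # expected to be ~1e-10
```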

 

CCE formula derivative

 

$$L_{CCE}(y, y') = -\frac{1}{N} \sum_{n} y_n \log\left(y'_n + \epsilon\right)$$

$$\frac{\partial L_{CCE}(y, y')}{\partial y'} = -\frac{1}{N} \, y \cdot \frac{1}{y'} = -\frac{1}{N} \frac{y}{y'} \approx -\frac{1}{N} \frac{y}{y' + \epsilon}$$
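A standard identity worth noting (not stated explicitly in the original notes): chaining this derivative through the softmax Jacobian above collapses into a much simpler gradient with respect to the logits x, where y' = SoftMax(x) and y is one-hot:

$$\frac{\partial L_{CCE}}{\partial x} = \frac{\partial L_{CCE}}{\partial y'} \cdot \frac{\partial\, \text{SoftMax}(x)}{\partial x} = \frac{1}{N} \left( y' - y \right)$$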

 

 



 

(Figures from the original notes, not reproduced here: softmax example plots and a comparison of hardmax, normmax, and softmax.)

 
