{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Convolutional Neural Networks and Image Recognition"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Goals\n",
"\n",
"* Understand the differences between a standard convolution layer in a convolutional neural network and the traditional 2D image processing convolution we've discussed thus far:\n",
" * It's technically cross-correlation instead of convolution (why bother flipping the kernel if the weights are learned?)\n",
" * There are Separate weights per channel\n",
" * Each filter sums across channels, and produces a single channel of the output feature map\n",
"* Know the meaning of a few of the variations possible with convolution layers:\n",
" * Stride (where the window slides by more than 1 pixel at a time)\n",
" * Bias (where a separate scalar parameter is added to the filter's output channel, similar to a bias in a linear layer)"
]
},
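{
"cell_type": "markdown",
"metadata": {},
"source": [
"Here's a minimal sketch of those points in code. The shapes (3 input channels, 8 filters) are made up for illustration; nothing here is specific to the models later in this notebook:\n",
"\n",
"```python\n",
"import torch\n",
"import torch.nn as nn\n",
"\n",
"# One multi-channel kernel per output filter: weight shape is (out, in, kH, kW)\n",
"conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=5, bias=False)\n",
"print(conv.weight.shape)  # torch.Size([8, 3, 5, 5])\n",
"\n",
"x = torch.randn(1, 3, 28, 28)\n",
"y = conv(x)\n",
"print(y.shape)  # torch.Size([1, 8, 24, 24]) -- \"valid\" size: 28 - (5-1) = 24\n",
"\n",
"# One output value by hand: elementwise product of the (un-flipped!) kernel\n",
"# with the 3x5x5 input window, summed across all channels and positions.\n",
"manual = (x[0, :, :5, :5] * conv.weight[0]).sum()\n",
"print(torch.allclose(manual, y[0, 0, 0, 0], atol=1e-5))  # True\n",
"\n",
"# With bias=True, each filter also gets one learned scalar (shape [8]) added\n",
"# to its whole output channel; stride=2 would slide the window 2 pixels at a\n",
"# time instead of 1, roughly halving the output resolution.\n",
"```"
]
},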
{
"cell_type": "markdown",
"metadata": {},
"source": [
" ## Convolutional Layers and Convolutional Networks: A Quick Primer"
]
},
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "CJ1zvCUxhz_Y"
},
"source": [
"\n",
"Convolutional Neural Networks (CNNs) are neural networks where some (many) of the layers perform **convolutions** instead of matrix multiplication (as in a linear (aka fully-connected) layer. I'll go through a basic introduction to Convolution layers and Convolutional Neural Networks, \n",
"\n",
"If you want a deeper and more thorough presentation, you can check out the following videos:\n",
"\n",
"* C4W1L06 - Convolutions Over Volumes https://www.youtube.com/watch?v=KTB_OFoAQcc\n",
"* C4W1L07 - One Layer of a Convolutional Net https://www.youtube.com/watch?v=jPOAS7uCODQ\n",
"* C4W1L08 - Simple Convolutional Network Example https://www.youtube.com/watch?v=3PyJA9AfwSk, but don't get bogged down in the notation details.\n",
"* If you want to see a friendly introduction to Max Pooling layers, also check out this one: C4W1L09 Pooling Layers https://www.youtube.com/watch?v=8oOgPUO-TBY\n",
"\n",
"Some non-video animation resources:\n",
"\n",
"* Here are some fun animations: \n",
"* and here are some older-school animations; these reduce things to 2D but might be easier to grok in some cases, and they include \"deconvolution\": "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Todo:\n",
"* Learn you a filter!\n",
"* Cross-correlation instead of convolution\n",
"* Multi-channel convolutions: image processing vs CNN layer\n",
"* Summing across channels\n",
"* Multiple filters\n",
"\n",
"Fancier:\n",
"* Strided convolution\n",
"* Convolution with bias\n",
"\n",
"CNN architectures considerations:\n",
"* Spatial vs channel dimension\n",
"* Downsampling mechanics (max pool, strided, bilinear)"
]
},
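{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a preview of the downsampling options listed above, here's a small sketch (the 8-channel, 28x28 shape is made up for illustration) showing that all three cut the spatial resolution in half, with different mechanics:\n",
"\n",
"```python\n",
"import torch\n",
"import torch.nn as nn\n",
"import torch.nn.functional as F\n",
"\n",
"x = torch.randn(1, 8, 28, 28)\n",
"\n",
"# Max pooling: keep the max of each non-overlapping 2x2 block (no parameters)\n",
"print(F.max_pool2d(x, 2).shape)  # torch.Size([1, 8, 14, 14])\n",
"\n",
"# Strided convolution: slide the window 2 pixels at a time (learned weights)\n",
"print(nn.Conv2d(8, 8, 3, stride=2, padding=1)(x).shape)  # torch.Size([1, 8, 14, 14])\n",
"\n",
"# Bilinear resampling: plain image resizing (no parameters)\n",
"print(F.interpolate(x, scale_factor=0.5, mode='bilinear').shape)  # torch.Size([1, 8, 14, 14])\n",
"```"
]
},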
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "CJ1zvCUxhz_Y"
},
"source": [
"## CNN Architecture By Example: LeNet on MNIST\n",
"Now, we'll use this notebook to explore the use of CNNs on a \"small\" dataset of handwritten digits. \n",
"\n",
"By the end of this activity, you should be able to:\n",
"* Know some of the typical architectural features of CNNs"
]
},
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "CJ1zvCUxhz_Y"
},
"source": [
"An early success case for CNNs was a network called LeNet (after Yann **Le**Cun), which performed well on a dataset of handwritten digits called MNIST.\n",
"\n",
"In this notebook you will experiment with different architectures on the MNIST dataset to get a feel for how CNNs work. With modern techniques and compute, this dataset is considered a toy dataset, but it's still an interesting testing ground for architecture ideas. In particular, we're going to look at it through the lens of how many **parameters** we need to learn to do well on the dataset.\n",
"\n",
"### You'll Need a GPU\n",
"\n",
"To train the models in this notebook, you'll want to be on a machine with an NVIDIA GPU. Please see the [Project 4 instructions](https://www.cs.middlebury.edu/~swehrwein/cs1053_26w/p4/#overview) for details on how to do this."
]
},
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "XHLKy2Ekqoq4"
},
"source": [
"#### Useful functions\n",
"\n",
"Run the below cell to define functions that will be used later"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
},
"colab_type": "code",
"id": "XPWrOExkqysJ"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"The autoreload extension is already loaded. To reload it, use:\n",
" %reload_ext autoreload\n",
"cpu\n"
]
}
],
"source": [
"%load_ext autoreload\n",
"%autoreload 2\n",
"\n",
"import torch\n",
"import torch.nn as nn\n",
"import torch.nn.functional as F\n",
"import torch.optim as optim\n",
"import torchvision\n",
"from torch.autograd import Variable\n",
"\n",
"import urllib\n",
"import cv2\n",
"import numpy as np\n",
"import os, sys, math, random, subprocess\n",
"import matplotlib.pyplot as plt\n",
"from scipy.ndimage import gaussian_filter\n",
"from IPython.display import clear_output, Image, display, HTML\n",
"from io import StringIO\n",
"import PIL.Image\n",
"\n",
"src_path = os.path.abspath(\"../src\")\n",
"if (src_path not in sys.path):\n",
" sys.path.insert(0, src_path)\n",
" \n",
"import ML\n",
"\n",
"def get_n_params(module):\n",
" nparam = 0\n",
" for name, param in module.named_parameters():\n",
" param_count = 1\n",
" for size in list(param.size()):\n",
" param_count *= size\n",
" nparam += param_count\n",
" return nparam\n",
"\n",
"def get_model_params(model):\n",
" nparam = 0\n",
" for name, module in model.named_modules():\n",
" nparam += get_n_params(module)\n",
" return nparam\n",
"\n",
"def to_numpy_image(tensor_or_variable):\n",
" \n",
" # If this is already a numpy image, just return it\n",
" if type(tensor_or_variable) == np.ndarray:\n",
" return tensor_or_variable\n",
" \n",
" # Make sure this is a tensor and not a variable\n",
" if type(tensor_or_variable) == Variable:\n",
" tensor = tensor_or_variable.data\n",
" else:\n",
" tensor = tensor_or_variable\n",
" \n",
" # Convert to numpy and move to CPU if necessary\n",
" np_img = tensor.cpu().numpy()\n",
" \n",
" # If there is no batch dimension, add one\n",
" if len(np_img.shape) == 3:\n",
" np_img = np_img[np.newaxis, ...]\n",
" \n",
" # Convert from BxCxHxW (PyTorch convention) to BxHxWxC (OpenCV/numpy convention)\n",
" np_img = np_img.transpose(0, 2, 3, 1)\n",
" \n",
" return np_img\n",
"\n",
"def normalize_zero_one_range(tensor_like):\n",
" x = tensor_like - tensor_like.min()\n",
" x = x / (x.max() + 1e-9)\n",
" return x\n",
"\n",
"def prep_for_showing(image):\n",
" np_img = to_numpy_image(image)\n",
" if len(np_img.shape) > 3:\n",
" np_img = np_img[0]\n",
" np_img = normalize_zero_one_range(np_img)\n",
" return np_img\n",
"\n",
"def show_image(tensor_var_or_np, title=None, bordercolor=None):\n",
" np_img = prep_for_showing(tensor_var_or_np)\n",
" \n",
" if bordercolor is not None:\n",
" np_img = draw_border(np_img, bordercolor)\n",
" \n",
" # plot it\n",
" np_img = np_img.squeeze()\n",
" plt.figure(figsize=(4,4))\n",
" plt.imshow(np_img)\n",
" plt.axis('off')\n",
" if title: plt.title(title)\n",
" plt.show()\n",
"\n",
"device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n",
"print(device)"
]
},
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "CJiMTcx9qr5Z"
},
"source": [
"## Training Data\n",
"\n",
"We will use the [MNIST handrwritten digit dataset](https://www.kaggle.com/datasets/hojjatk/mnist-dataset) to train our neural network models. There is a simple wrapper for the MNIST dataset in the torchvision package that implements the Dataset class. We will use that in conjunction with the DataLoader to load training data. Run the below cell to download and initialize our training and test datasets. You should see an example batch of images and their labels shown."
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
},
"colab_type": "code",
"id": "hURbcBfwqUrY"
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"100%|██████████████████████████████████████████████████████████| 9.91M/9.91M [00:00<00:00, 32.5MB/s]\n",
"100%|██████████████████████████████████████████████████████████| 28.9k/28.9k [00:00<00:00, 1.36MB/s]\n",
"100%|██████████████████████████████████████████████████████████| 1.65M/1.65M [00:00<00:00, 12.1MB/s]\n",
"100%|███████████████████████████████████████████████████████████| 4.54k/4.54k [00:00<00:00, 823kB/s]\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAUoAAACwCAYAAABpVyfOAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjgsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvwVt1zgAAAAlwSFlzAAAPYQAAD2EBqD+naQAAUQJJREFUeJzt3Qm8rlVVP/Dn3ntyyDTIoBTillqZGSWQZpOGsymaAhIBKiIiKlggozHKDIoMogyCpSAiKg6YZHGby0ppQDPNBistUxosGyj6fPffdT5P7/99z/uee569n33u3evzeT/n3nPPPc9+9l7rt+a1N9xzzz33dI0aNWrUaCZtnP1PjRo1atSoAWWjRo0aLUDNomzUqFGjOdSAslGjRo3mUAPKRo0aNZpDDSgbNWrUaA41oGzUqFGjOdSAslGjRo3mUAPKRo0aNZpDS92CdNpppy36o40aNWq0LmhRXFsYKIP+7u/+rvurv/qrrhbavHlzt/POO3f/8z//0330ox/taunIvPe9793tvvvu6c9/8Rd/0X3hC1/oaqGHP/zh3f3vf//uP/7jP7o//MM/7GqhBzzgAd13fud3pj//yZ/8Sfcv//IvXS30vd/7vd297nWv7p//+Z+7T37yk10thPfJAPqDP/iD7j//8z+7Gmjjxo3dox71qPT17//+77u//Mu/7GrDjNXQqoGS0P/CL/xCVwv9+I//eHrp//7v/+4++MEPJsCsgb7+679+GSj/6I/+KIF4LWS/AOVXvvKV7tZbb+1qoW/91m9dBsrf+q3fqkohWxeg/OIXv1jVnu21117LQPlLv/RL1SiXTZs2JeUCKGEG2ayFnv70p68aKFuMslGjRo3mUAPKRo0aNRra9d6W6H73u1/3Az/wA93XfM3XdA984AO7Bz3oQd23fdu3pX/7vd/7ve5v/uZv0ufOO+8ce6mNGjUakbZboPzGb/zGlNR47nOfm2JP3/zN35xiZI94xCPSv//yL/9ySijccccd3ac//emU+KiJ7nvf+3bf8z3f0/3rv/5rCpaPkSwSf/qGb/iGbqeddkqfL3/5yymu+I//+I/d3XffXXw9jRrlou0SKAWaf+zHfqw77LDDuic+8Ynpe7LlPv/1X//VbdiwoXv84x+fgtEszA9/+MMpa1dLRt36d9ttt+76669PiSJfb7rpplEy+4973OO6Aw44oPuJn/iJlEE/55xzui1btowC3M7N3qBFgNrP13CmsW6KR1LSpxTxpuyBJOhqE6EbNmxIX2vYw9y0cXutnTrhhBMSWAb97d/+bffrv/7r3ZVXXpmy+qyjHXfcsXvkIx/Z7bfffomhaiFCJaMe5Sr/8A//MJpV/sM//MPdM5/5zCQsLFwKJrL9pem7v/u7u6uvvrp77WtfO/dneRBAftddd10G17HoR37kR7rrrrsuKb0XvehF3X3uc5/sz8Q7lO373ve+7txzz+2e9axnpdDTonT/+98/ha2c9dLStm9vbftv2CMa+2EPe1gS7oc+9KFJc3MVzzrrrASUX/rSl7q77rorMcFFF120zAQ77LBDVxMR7G//9m9P4PSpT32q+9jHPjZK+dPhhx+ehDyApm8ZjUF77rln9+AHP3ihWkLlIUCdRwFYnX1psl947dBDD+1+6Id+qPumb/qm7uijj+7e+c53pnfIVeoGiIEcY2GPPfbo/u3f/q37nd/5nVT6NI82btyYYvuXXnppAlplXJdddln3uc99bvB14iX7c8ghhySFdvvtt3e33XZbUYs7G1DusssuiVlZG+r0vNTnP//5FLf693//91HrvDAmZiQkDuGzn/1sclk/9KEPpfVhTuCDeViQmML3gGgt7gVLQDwQo2POiAmWLAoXjhDLZZFHDV8QKzxivZSO+GkJxnZWLFox00WKm50nHvj+7//+7uu+7uvSWkufsTUDGwoZXwaPAshcaxEuoVCe8pSnJB5S4/iRj3yk+7M/+7OFFMz97ne/7jGPeUw6+7/+679OnleOIvfIGShaf/azn51klhwCy20CKBXAhvvFJbSJcRC09mc+85mFfo/NAKySKEMmBqJo2Lp0Mrz+9a9P8bRgTEmSffbZp3vIQx6SBAjIy3rXUsj+tV/7tWlte++9d/f2t7+9eFE290wCzPMBk/X0CXP/4A/+YFKUn/jEJ1K8Ungg5/4BF+sQU3Z+iyTeWFG6zL7lW74l/V+eg/h0SfJMSpniY+Xhw1/91V9NfJ8LKHlHz3jGMxKPkwXF89x+fL4IsO+8884pFEVhOltJz0Us0dWQ58APzSR4DWg6Y0rZXqlEKS2PgwMlN4Jr23dXDzzwwAR2MrQyyYswgYP7xV/8xXQY/s8QBHx1L0S5D2EhxH1iSWIk6x/LhVyJgDeLDiDYGwqoJKkUOOmkk2aeIcFnBWBkFscrX/nK7nd/93cTKOUiQPfYxz42WUosjt/8zd+c+3+ABOtTCIN7Vxoo8ZbnvuxlL0vrQDyDd7/73VnX8V3f9V3JitZpxLtjKCwa495pp53SHoujvv/9708fxsbQxLo+6KCDEpjboyBWrL2RMLRXJT2AwYFS7EIcIVwg1g9NQLgx5fd93/elg6FBCX0QwaJJuetqGoEaZrYxQwFlUGRkJ7WStYY16dlcW+VBPmOY+9NIjPXggw9OWvXjH/940ezyS17ykm7//fdPQj5No//TP/1TOj/uHRdcXEly7Iwzzuje+MY3ZlkTPgPMl1xySfJYAM0iLbaslB/90R/txiIAALRY5XjNfhJ+FmUOXnNmzkRMlleAd6666qoki4s8b/Pmzd0LX/jC7id/8icTQDE4KMIcZXNikixKbn6fgDueEjbAV9oiYUwJ2RwcKJWIcAcJC8GhcQT+gZ+XJOjcb6Y7DRXE4qThgCWLhcWEcsQ/pm0sF0xAnanv2QRefzarVhymBqJYxIBZdaxz+1VqCELEJcXUopQqyFoklGRt7VUMtxCCccay0dziHNYHfiL84uLAklLDdysR3mS1eJ8ocSlNAfDWYg08KHKB73IB87777ts9+clPTs/E26uJ9z3kIQ9JZwpsJX5+5Vd+JSmmIa06YM6TE9aJpBy3Hk8J6Vg3yz/CYwwo+8YzXHdAKXjvExQxNC9IQ8g2//mf/3kSJnGsIJajDzA47rjj0svbBFqrBAEAjMstcWCsSd05v/Ebv1FN8TRlAyjtGwayX7ljNYSYxcP6IigYOYSDNQEknfF73/vetF/+TpjspVilUAZAwvysX2c61JopYOVbFJwz4gouYmEQQoIPLFlxJfZxkuxRlFF5tqRKzuoFHpx9sl+SmPZpkbANud1ll126Rz/60QmsGDLOWvx56BCBZ7F2KVY8Z09++7d/O4XHhPOsA/95F7kQYQAhHWvJbVUWKw/CyDR9TNGRZZzMTjLvbQjtR1tIAhG+3OSAAKWPP2NcAOnzp3/6p10tRLgxK6AClCVcDplhzHviiScmj4BWR/YIk7LgWBfcuLCGxAyB4lFHHZUsSvFD4AR0ueB+bghLhFID4IAYWC86mu0JT3hC4jMWy+///
u8ny6h0IgdQcrtDNgCCsEFusu8stPCSgt8nz8NZ+TfrPPzww1NTAcXCYzz//POzrA048hAYSwykd73rXd0pp5ySeO6pT31qsiKdHe/PusRKfT3vvPOyV35UU0fJwuSqnX766cmiI3jiNbnLiTCEeEdoLMKjjg0zsHxrIgwCrACNIDxlk5vsD6sQQ2JkBKApESES2pwV2be6eQPaPl/3utd1r3rVqxJYEoBTTz01xQ/t6xAuZlRXsHLe/OY3LxQvs4dPetKTkoWEty688MIUpytpUYrVK7FhESHrZ+XZsxJkz8RolQcxXgDSZOZa/DRiqI9+9KOTwhTXJJM5eS0UsZBXjGZzrgrjeRC8GmcI2H3f3pVQctUApUPjGhDI6BmeF2taKzl8prz4mbIHB0Vofu7nfi4lS2oZgoowiE4SoQtF5uJEmKQU2RsfZyOWRpEYYAtspoGMvSNUKghYJmJyAHeomCCrFeAQeK4ZYZ93Xp7NMgHa/syjkZEvuY+IsqOUo6pCrJDnkjOL69zsERnjITgPFjmlBxAnwYZVJ1Tm69LSUvIQlALltHqdqcw2PhETx+sK2hEegw28viDKBU7kiutWB5RASoeH0hLMorZL/CS3NYkJ1GpxCwWIme+0JveCVVRLkXkAJcaxZgCFcUvGTgPgMOfb3va2FBZZqT7S9ykdPxP98zH12vfXytwsD3xjDwA3xbYIfcd3fEdSjqwR/6eEVT5JXFgf/AXcWeficTnJ+4qB3njjjSl8w8q3f2KWQl3OpG9V2lP7xPJF9or3kHO6O57hndgXMgksrQFAUixil6xK7+L8YyCL9ec2qqoAShqNG+XQxIsuvvjidCA5J/awJh3GK17xirThDgcz0Jr9ZFQNBGTE4TA20KFFS8XUomUtLErW7CK91H6W8uGus0gCKMUUxZ3XWlfp/FgggE5scpEBF/4Pt826IglVmlhLrCKCb73WwWrKDZQUCgtaDaI9oHS5/oCIMmMc9McJUoQ8LW733Xffnf6N8ZIzFkh5/tqv/VqqxeYpqO6gWHkOQmNKC1nAEq3yGWSCzHLTc7fxjg6UGJzw0LA0mmJhAfbcQCAxwswXgyFAGEEc5PLLL+9qJBYlzSm+p061ZOxY8S/AmywLWon8vLIdzC1bz1ogcGKrQ9yfwpUE2oSFK03gVypCBtaqBbjerHJgPUaPPK9JHSC311p5TZQGwCxBnsdr8sHvsy7XIo8MGOf28Y9/PBkvQ9czTxIvDlAyligSCSSfOFNWLtffEA9dac6d/Gpo2aaBMqwO2oJAqrN705velB0kAY5nKp6NySeybAS4RE3WagiIAxran5tG84tplSLWpExjjOOaR0DIelkBRx555HKHFpCM7PIQIQ3eht5fXoisumSO2PJkvJGbyRLnoonNRbJgNaA/JNmPqJ20FyZW1cZzca8MpWJt5513XgKp3LHcKLo//vjju5e+9KXpXKOeWkkSkDTyUJiAFckNd7birqx0/JArHDUqUAIsCQrWkhhDxAdzE3fCIYhXIc91ANyNmuKS/fY8Gh6QA/QSwevJPupwvRexmCTmrFnSgiIibNxcjC62OoQidE6SRdZG6XHR1NdNCgpQii4wYBmTjgjbGBe+cWeBJReSdSfUU8qaXJRYaeRSTBCvffazn10eGJObgKVSKcoWr8MGa1CJIoEZdbisYfyF14CkpKG5rLlCA6MBJeYVKzEHDwOrxQOSJeKDEkcsNK6Yw9czTehqqpkMolGBAEHHBEMPIBiSMLUwivIbxeZBAIGb/I53vGPQHl0hGnFP1qXnEZxJMCds4ph+RtwryDpyu5LTCve53EAAoFsXUCip+BYhyg5YWqe45Be/+MWiicOIOwJoioUi4UXBhkgeqlawTl4CpazjSMMBgysHoI8GlFwiMSx9neJNaqZy1mj1SYwoin1t6hve8IYUTF9tOdC0ga9Dj8ji8mIGgMAyq0GoAoy8f/zZ+s4+++zkck9eBUrQKCEtjkOSoD6XzJg88TQB/skz4SlIVrA6zjzzzP/Tq16y/Mu6eAXi4rK4BJp1tOiQmFLkPHlcEpysfxn5T45wj7lYvM+sa26dHY9AUorrHVOYrDmHjIwClGILLElDFjBtTHxZdATb0MQVwhiTWXYxGR9uUrRgRgbXOwB5X4P8nFjZkJap8IRYEcDUBZOjX3oeRcY65nmyGJGCc1ZkDDeJydyTVp1yokUy5VtLzs2+TAPi6F4iWJRhWCRRvFwKBIQIAFAU7QN5VlNNIBmkAoVMiJ9ec801Xa3EGiePjC5lfpI6vBYu+roHSgJn9NbTnva0ZNoLwL7lLW8Z7U4aQi2jBxAnawK5HVESEf3L0Z8ePbD9MfjW72dZOEMIIMtMiYR9ioGupXuSWfviQS9+8YtTGIDlxnL0rioHKI+w4pztZJJEWMNe5E7QeeYiLZ0B4ty5UnME+gMfPJ/FQ5myhmsiShlIxhBhcdwaw1FBzpt83nzzzcn1fs5znpPwhME19FStokCJSWS3mcoSKYSHIEntT86FzEnKMbg+rEhrEjeaRsz6mOKMiVggLJE+GIgz+UQQGZANdQcLhgWULDXPsO6S08wRoeZ+qRBgDXl/JT/TlFoUUPs/9ldyTOxZfLIWinXHfNRSZO9YPgAzSoJqa5HlGahwYP2KEwIdvL+0tLQ88rB0T/w8UknBBVfuFbNQeRbGwA05C6EoUHIfgY2Ke8LP0lDS4UBKznuUBGAZyur13cQYBBDlG5IDitLD/UYAq99fKsakYyH6dP190S6ReeT5gtnW4vfbL3tVkrwrwIvaR4pupcuvMG7cha5vX3yydIvgPBqjNMieOUtKFFBSeDXEmyeBkkVJBoQpnCPQ3GmnndJXyZTakoni9qzKG264IXk4lDhrUihvyPGIS6WvMPBCwEdZhD9D/tJMK6gvPiRD279dEQiwIt3R0Rf8W265JZW3WCfAEAOZZyX1Y5dbK1isbnEtz+XOj3F3D8uLVf0zP/MzKbtoz5RizCKZR66Q/a2J+gMXSltGrEhhCxYloPTsoa84GZp4RuSUJ3HEEUckRf3zP//zqb24NrKXyofMkmVRSpgxxv74j/94MHkpBpSsIy8ScT3CN+SLrJa4+5PlIRgacJqHOdl/iqmt1decrZVBnhMzOTFpDOoYi7g36k0BNmFxj7caNi6kvTQVCBiJaeVux9saEmMWw2LdAvNZ2dQchG9Yj8rfdJOweMa6YnhREh5TQof3t2zZ0l177bX/p8WxJoqxg8ccc0x37LHHpmqaww47LE2uGqqyoQhQSkpwOwRcI2sMCMZ0yYBdCcDbWuLq6wGO6xVYk2OuN/aLNWQ9UZpDkKyNhe1cuZW1FVCjqFsE8kqGhi5VmifIzvE973lPMhSEUHLeIbS1ZI2aAsgoT4r3RDl+7GMfS19Lx8dXQwwaBgXjB8BrQlA3y9AYAiyLACXm0GomfoBC4Gq52bBGIlwsbp+aiJUrqzhWKdfWkniVxJJRY2LKuafNTHu+Mjj1nKzJIfrdhyYJVWEmIaiIMwszffnLX646TBBEQVuzUjFWJdyh
HNcNUCoEFvuLJAALhPtRW5C/0bZLeK4fUhnDQwCQF110UVcrkce4TG+90gc/+MGUdGKUadTg6QwxrnGpNLPQ5hre4yKqRo0aNRqKeKsK0ZUKsSSHqqYpApTRdSMhARy5kzGgs1GjRo2GJCG9oUuvVg2Uaqoi1rhaioEXyoSGouiUkbG2rlrinjEeChn+Ucu6ULQcKiLe2rPMQf1bOfWMawiohaKMzN7VtGd4K0hdcC3hrI0bNy5fdbEWzMjNZ4vShnsWNOtmDfhs1KhRo/VKi+La/4P8Ro0aNWo0nOst9V5qHNoipOXKyDTlC6aj1+Li6g1/wQtekP6sC6mmYl2F10onlIMYSFILWZO1Id09pds1V6IXvvCFy/dNl7h/e1HC+2QAKQqvJUG6adOmNB3MV4X2SrNqw4ysQCkGohi1FooibBEE66oFKPvZNomrmvYsauLsVU3rUr8XBMRrWlvwlfrfmtbVL+5XM5j75tJFCUBGVK82zNiaOG5zvRs1atSo9lsYG61PiunmMYmnFku+UaMc1CzKRqtjmI0bU9mOqU/idYp7DfJt1GhbpmosSu2NpmX341QC077vMjAXDWl79KltJl4uUn9mPwx41Y5llqdWUC2hcY+x2lR9uSV6r1mRamBd46F3X8JKi1iJC+Ea5ScT2F2pYMqXoSfu2HZ/9tiK2cg0dZj4H//FBHYkJmutZED7IuWdI067NHYRr8lCsp1m3wEGnyCAaJKJLJUpIP6uSb/kiKyx9sVcvUc84hEJIAMozafENPbLuC4uL8URQGnKM6bJVXTsueZjusZD54OeYNPPx7j2dT2fLQVjXqnCZxPs4woRyRhDR0wXmnY1SW6Kq3/d6Y2vrIvn8IWBr1VYaW/M7LQ/QJuRRDmbPEYW8LyhIvYNv5vAJOlnwEkM7si1Z8WB0osboOpjRqBhsHvvvXdK19uEPlAaIkAggakuEjGxK6+8cpsGSu+IEdxRg2l1Xsjs06z2ILKJmKZ/V4x9OuWUU9IYMVo1B8Ng2LiKlgC5vZJlC5wbrUzOj+CzhlhFOo+MA3vRi16UztAMBKPMGAT21aBooFm6m2z33XdPPGZ+rE4fny1bthR5vvKrffbZJxkDrMfozLJXeDxi4QDyAx/4QBpDaGaEvctNS6WZhdtoarLZlBglrl2YRi7V2t7Ifrz61a9O7m28PwY2EJfFoRwk+ljtG4uE4LEGzjvvvKRVDdHNcXGWMABLyDO4Ze5WLmVtrGfC95S9uloDZYFlTNIKhQYYfIABa+rCCy9M9YclCQ8xXIIobPdJbSkElPbo/PPPX757iaVoCPTVV1+dvCr8TylT0qWp+OVihmkqKsYUkyDZ/zONalMmh4XSurko7gfBqKwmaxQSoM3MMTQUVBF0rhipGK0YpHvHgRJQFAN83etel1wNbgYrI1xrAigs4WqGfffdN/1/k8eR2NLQxcfhCbBwDTqpcUAv4rqxuBU877XXXmlf8JZzVPyvmB0f2UdTZljF9irXvU3cRlcOu32UoomKAQrHs8V7+7MBnKdrF8agvgziPSBVivBTXNEi7m2a/llnnZUMA/Wrzqfk3VpFgNKGA57HPOYxyzcIevm4u8QL2xgxSJcDAR8/52e4H9xuf58U9hwDT+NaWB/XsbLkWL6+D7BofQwe7yAmKLlkUrb1DXYvx9JSeh6LgzABR9pTVwPtGves9JnF3cu0L4DnqgthUDLckiG7gbhF9kYowLsToqHG7A9FFIe94z6yyIV0uJBxV45z9DPOLG63pBCBgY6zHFPHuY/AGlD2hzEwAsTbdfkwHMSkhZ4QHuhfgzwW4TcyWIoYAhQEPou75PF96SHL02gpJ0iKacmgCfpLNMQ1sQQYo9ASBJB1wlVkrQFCI90J4ZDz5CYJI9LiQBGTPu5xj0uhADEZawfi1hcxQ1amYLPvA0qj4qwTmA1lWUUcywfTsDZoWMA0i7gmrEygHsNKgQBreEigZOnaGx5BnOU8BcGSs2dxJzkLLud1FniJhQiU8B3LEp/ho1hrXBsL8O2RPaNs7OPQQOkcnYUEGN6yBjyFz90EyrLlqfg+HnIHkXXhTYpa8qzkPUnW5nl4KC5Ei+sUShAFJsYuRCFxScnIY2yzQEk4BGQvu+yypB0uvvjilBkFegDmp3/6pxNjEHAHUfq62gA/LpF4KesjtDiBxizW6ytmOeSQQxLocOcwvwNkiTzzmc9MF1UNFUuKRBfST++OlUVuM2TlXnHFFSkBBNCEC1wORQiHooMOOij97nlXvdpbnxjh5nztq7NmhfrkOmvAd/TRR6db+JwjkBTzonxD6QEiVjlAlb0HoryZoecXAhr7hXe4kkDbvlnTueeem+LI+Mv33vGOdySwdmYmoONN09it9Wd/9meLZb9Zj29+85vTBV34PtZ//vnnF3m+97QGiZowTnhy0diwTQEloQAghx56aIq9vPGNb0wC348tYpZ4cZZk6U1wW5uL3rmpDsSarU8IgCBLihAcAi3jBiBPOOGEBERqOgGR0IDvEUCCOITW8yxMSsisZzVDIewhFx1zeSfW/JAEfIAeK1r8c9aZOXeZcSDJGiBwER+0R6oW3va2tw16UyNQJtTXXHNNshadDasXIFFi/ewxa+n4449Pa3TuFAxlNHRSyhkKAdg3ybaISb71rW9NoBggiXyf16DUyt6KrXLDXZvsZyT3SpDwDYC2n2PShq8qW8qF8uvHTXkkeKf0neiDAiVrCwMKRtPYTGnuBfDpu1x9ISsFkjab0AJxDOgAuIaESKyRyS9ZQ6P5hBYn4FxxgOjnWMusUEzlfeNWybUSzUmIxdisSZxxNfEh+yjEQTAxGOt0CArr0Me6AKV96P+7fSDY7v527uKCLHTMrCbQeoC3d8MbH/nIR5IVN1RhsGcdeOCB6bkAyO+/6aablifp9y0ye0zxUXbOVGeR9xqaD+2/jLF3t0fCDrwS908Dxcnn+TvPANirCAH4FA3XfYy7z8egjRs3pv0SW6aUyZorZ/vnA1PE33lczrdUDHVpyE12uDK2rAkAw5LEHCyJsXuBCTNB4mZjPgdioonSB+skMISsf9tc1C2y7MJ9YrEBRx8M76AI3FrJ2ggw4fB7lUOsduKKnx9iLX2yB5SK+B9ws66wdAmV2Kgwi+w7JUT5BEBKkFA+FA3eIACsZkLg54YASoAnWcJDsB4WpAQYsJy08p2leKG9JpRib4Rt6LhpxPeEnaIMiLCzcnkts+ojgamaQO8gvAMo+p1q2zJt2rQp8T78EAu3h86FARE8bU/wkr2NpB3Pi4Egpp8zfLc05IvKcIv7eDmLP/nkk5NwjR1fQATdAeg6CEuSCX/ppZcmYbFGh4D8GZM6OJaQpMpRRx2VQNIhxftwoVhYQ3TCiJcSeM9gsQGYGkZT2RNXf0p6AT/vGwQkxWox98EHH5wYGvhgXmU4N9xwQ7KSMLDgPL6QaPE19nqtJBEn+A8AWY/CD/Zvsq0SfwIdihJQ21vnap+HJkoU71AKgJKRALRdVztPkeEtSlvoBBhE9ndsQyM32SchCk0T+IoXSik
zGiS+7Is94bHYW7FTStx5C7H4eSEX+5sDbwYFyqhZE+8hLDV1bEwynK8Y2sazhATOI+AO+BwaywNwqeV0eOFi+78UwFVXXZUUwhCazN7JjBJgv7eWfnZWmliu8wU+/cwwcAQ8kdEl4IYni7VNFrzbU4w/NBMDk2gDpPwI1mRs17nLIB955JEpYULACJ8OmBwEkFk7eMqz7YXKDgC+mvuxQ1l7x1A42yrtuOOOyaMSa5bMueCCC5Ih0ielQ6GE9t9//1QOZ5/VWwJKiTzymMPAGBQoxVYwCeuNFYIxvSwBGXugKBdZ3ZoPcASKrDhWb7jSAaQAEZP6+B6rMjJvhJBb99rXvja1Tw1VGuT3E3TuPyEu3b42i+wLJREXRcVaxSS5u6wm1pIsruJg65+1J/aPBbfaRNVayNqdt7WGFcJNA16EKwcFb0VSQgcTIbdPq1EUvBcxVXuMf0sDZcmC889//vOpgkQCWBxylqFA0QDC66+/PtWg8lRgDU9FEvbGG29MeRGYU22MMgKwETuC+qwNwMJl8/LcLhYaZg2BkhhhNvtermyWDfYM1hprMYrLrRMoWYufAfjeI8paUIAE11MgmcYTfxsqCWDPIvkiLmMPaghXxLlGZ0tY1PZF3SmLiQIUunAVcdS/TqPoVY+9Hiou6HfhK2ACWEy/UYPYB0r/Fo0ECEgOVakwjewPYyH2i2dFqa7WfcYT9mylNt+c5IxKhX/uvvvuJP8+84jCiKQrWcEDFK++eclM+wVExXsHawbpBiRCY5ExIYVpzNWJ4mduEcblvvl40UgC0QAYGFPJ+A5N4VKrLRQfE9uKNkWHYy2EXKzEZkcwPRrxWUoSP7fddlsqJxmyfY/FyhJfa1lG1ILOq3Xcmqx3CKqvnqE4WljCeQFK7s+85AbryB4TiqHWxxIBQiw4gA4ogXh/TzzPxxlbM0tpJVBfK+EbBkEQ/lpt0fYYwLge6a677kpJMmfKKCPXBmvAIv8WcctqgNJiuBcKk1ljfcEiVP1m+2lEuwNSrrE6spwWFatGHM1nklgC4iRRhwgkMbqAsRKiaaUdayUZY4kmgLm1SgIYqfEE8Na3mljYPArgjY8z9RwWD6UjZrRSUTClaU8x8C233DIoQEU8VAySCyaeF/MKYz2SSpJLYtHKk/wfXk4uwkMUcNDW8MvknjeaTTGmTk2smQeSiyeeeGIyetSlDmF4DQaUmP/MM89MFtt+++2X+lcJEtd2kZo+mj9cYdqBizw5ECM3cdG0vll7lHUARu8l9lGTS9wn1hqLSowY2IoL64bKRfZB0T4mBPDKfnRbKSSfVqJ0wAEHJLeIFS7Zs1JL5mqJQuCNqG3lvYhZsRq5Z6xGSRTVGM4VSEbp1XqYeuTdxOvGiE+uR/roRz+aWoopRfyvUF98uCqgDEtNsF4bFFeMlRGj1AT/1SJGl8KsrHQIfenuANkzFsnznve8BJLWA6iBtrKOWkGSZSdbyFryZ66w+KmE0Fop2jVZjP1kThRHuwJC8b5zdZ2rmKVYZZRbqadVLiaUAbwoUTWCQ081YvXjPe8sjhwhk0gKCO1Yo5AB8BT2qSVZNkn4LjwDgP6+970vrXlID2EWCU05OwYL44bMmkb17oqu512JogyLhyE+yYsZqvFi8BZGsQFgGfVpFo1xgZ+kDaCMnk5xG/8eRcjcOAJJqPqCmZsAgXiqzGi0TNHgQDJcu9yhgJgMtFqSuGBJASNrpFUBxhAWU8RnozZN3M1eYT7fY7naK5ZiNBlQcLwCoAm8WZsUJX7gdosZ5bKOWLOTyQeKV1zcmng9MUMzJ/AI1YibCjl5d2EBJVa+N4/sJxmxj4BSTLzUUAjn4nycedSdks31RJQhBe/cYdFQ55wdjQgIBiHAERdyELKOmEDsT1wymGGMG/24a9yyyJih6NpRapBbm1MYrFdg2c8uT6NIpkhcAC31lwBJDFjmjwUC3IfYQ+shOCwawAiUVQtw7wkyK9sZvuY1r+le//rXJ9fb9wDEK17xijR/0f9RDsRNB5SlXUgJJAIfveam4+cYp9YniT9xUedq35yR2P0864ZxIL5JIdlHilqrXgnybHwV057GoA1fbYclg1tjKPk/jDE8ypPFt0MlXYsNvaPFCTAgxDCyU7QVS4Q1F6U4MfWlpECxegh0gCQmNRwAUObIwE8SRUEoADahVsbCZZy2B/7dz3F1xQe53axxPwuYKKMhSzq42H6vujVegQndumAUaztTQGBNwECc0s8AcOeoXc/QYfuYq2ZxHj3/+c9PwM3KU62gIDnXnUJBLBnPOf3009MnvBTKQuhhFm+zwsXVWKB4gpFRauanZ/NMVIKMRTvssEM6L7xPQax29iw50AAhHESRv/SlL02hmHUFlJDddQZil4BRJpSrTeiipAVAco2MmirR7B41gjpMxLH8nRXFKpMEKFUUHWQt4kImLrHkplmyYleAMoCd0AFzACk2OXRhvzU4F2Dpw+o27EHBfayPJneWPtYkVIHRL7nkkuQ+Dh2TXM3tfSoKxKpY7EIFpRSw58Q8UTFHIGDflMlNW4MuMCAl/OMMzR+QlChF1ibpNpY1uXnz5hRHNgRDTFSsdCWgtJ+MLBYkK9j/ZUniQYbCSSedlEByqLrsYkBJqDArlyTKRaLnl0vETFYPFfe95HZ3CRINJBsbU7BtKnAMt6lEAD1IeAJYExZZWmuZ5j6LvQDLuJKB1WbfMBflkmPNfifrxtxE/fG6J6yBReuZ/Riuc8SgQCKuWxgjCRZAKT5pzyge/FdqLVF7y9WnAOPKjujsimn/9lE5E+PB5C2CT0njh1JDe8VvnaVzDdJuyRMo5frf6173SnhAJlm2zi9qr+1TeClxZTLcYHkLVURegzGmIQTv2cMhE7DFgNKCLRzjAASuWtzRyxLiltP4pbQo4SFIL3/5y9OmR5ZbIoTVW/qaA8+NVkplPvYFA0TdYhBw5EYStKg7NRgg982UcZ8J8BM2oVysgzD3B0vo9waWgGlMsnfW6GxjgnjpAS34CV87S2uxb8qUnCeLSRKPHHDNVVxQkBQLcCppiYvjsuiANO9BDJdSBJZDtwLO6wJS3mM9qiXsDUOAxSguTibtnQqVuEom7tIR2iFD9i7uQBqSil7MgUlthp5gBd0YgxYzQZnglwQn2gjjxvW40YdsKnvO6wpmUQwOdtAUSUyemSRutgQKQRPzKjkmnxBbow/KWau5VuoXxQNI5SJj3N5HYUhysfpNKze9SuLN+QFRsXDWkHUCVuevjKrU9QtBrDVrVbRtIEXuOO40vhZusAYgycIEmCxxMfsYO8iDMdPTmVLKkc8AlDm9l9FuMIr7eDHIGO4ZF9fE8iAWmQNguo9FABpDiLfMypBi6LhgbOjZk9sSETSND9w5bhjlMhaxfPC7ITEUoNpE/CeOZuI5oqQlgMgEa6hk5YfnqlIweBkPjmEoIBY2j5KyiOqPGIxtP6JLKf4crbAlKmVGA0ovNuYtftF6h7iQ7qmmocbugLAvYyQ/tjWKovhalIrnqxIQfmIZvfe9702xeq6mteJB3oFETm
mjwf5YV+nrFaZR9OXXRuPfiTkScXPE18SOaHtlGyVKgRqVE35VFrpMANMiU2lKJTRLt+Y2Wjttt0DJ1ZFhV4cocM3lqbWtrdHWgZLssvpA8cAhLzNrtP3RdguUOoV8zjjjjCp7uBut3dV1wZi4czvfRmulDfcsyEWnnXZa+srqqimGJs4T99jUNBFGMiYGxYo71RD/CRKbVXLBPa3lyon+1QdoqEvbhiJn6UzF1WtynfE+GUDRp10DbdiwYflmALw/9g0H0zCjj2uDW5TRgVEbORDFxTWSgwlmrokIfq17FoBZYza91j0DTDXSfe9732VgWq+0aqBkUdamHaKoePLmvbFBKBg35xUXWwtCBD6mxdRC1hQAmePq3bWQswyLstRd0osQAIorbXlUtViUKC7kqxUzsgKlDLEkSC1k9L8KfsH7K6+8shpGUb+nNxrpnBEPrYVc76oTA/PqK6+FFNobioBMGqqpCsHAD4CkUsLMzVoI75MB5IaBWgBp06ZN3QknnJDqICXTcneOrYYU/ettXw2VG/rYqFGjRuuUGlA2atSo0RzabsuDGjUaKyRjlJ74XdztY7SgWHHJ1kETjdSYmrXg+eL7NcXRa6NmUVZCMWE6MoTR49po2yKTgg499NDuwgsvTBfWiSuaV1k6Y220mgvZrMH1vlHK1mg6NYuyApKBIyimgz/1qU9dTmboFio9RaZRPqL4jj322DRvkUUp8Whgxv7775++lpo/GYkzU/KbMl6MGlCOSJiU9Uizu/eGGxT1lgDT4FLDOtxqWUOv8mpdO2PFuHUy/qXmGtZKztkcRQNyZc9N7zbf03AM3sRQVxYsQjEgVymWSfXmwOa+R2hokrXeY4890qVt9jXkJoA/pgq5k4sMGce2LoCS1eRwjJkysBezYBC1X5hECUFN9WmlhMdINWO3HLoBr0FmKTp8LpGfM8HcoN6hLksqURNpliCANyFnTDIN3n5SRHgQcCs9KlVKI4ziuUqfnLF4pD507ZVGnFlfybIeMVIyqISH7Ok0qqlmdRq5EcEMWTMqrd+1D6xiljksIRvA3gQmuBIASq6cdfVA6TAMx/VyhvWKh9AGvueFaFZ1hnGn8NABZZaNZ/nadzNoG4H0aMf0Z2vwdwNz/V1xcc6RT55jeDGwpEg8D7DEbYzW7N8xhXWZ02e4g5+puX/ZmRvRb+CIkMK0AcSlwFrCBEiJDao3NE3IwGN7yWLXNgkkYlJ2DqLo1K26k4byA9R43lULY4z1Y6zgKa4/cKlxrFlfwcV10oDP+VF4lA+ZoHTsJ6sR6Atf4LcDDzwwKQMyhBfXStmBkqAcc8wx6b4QgoMcTozlx8RHHHFEAkgvP/TgXDfwubDITYtx0yMiGHElLWLZupKCO0QL0bImeee80xuDukMliuRpRRebmWxkrS7yckWAPTQFnlXJVfMZc9IREF9pT7h2gCnuAB+DKGEWB4Fx9i6iCoEhQM7YhHZWCuGisHN1KbEixZ9ZPvbNpHWfsWafAhz7ge/wfK3Z7oc97GHd4Ycfnm7+pPTi8j9nZSyiiegmxzMe+mDPMNIYI+yjOcDPVAuUGJWgXH311YkpMam7VFzHGW62g2JKX3XVVSl2AyjcdyE+Rzswl9c6gMDvp8UnBRYQ2VBXWyIMDDxZlWFRWo8rGmh/o+YBvDKKoVolaTtdApjAfR8SOIQ3BOj2229P95acffbZCXysFZPrxjjxxBOLu0vcHLdmWodrWOPe6pXImZccvEGYnDeryfRw2V3n2VeS+NI7UER+Fli4l9wVDDmUopmngAk5T1dAjBWOYJiQNWtiBFAStVqUu+66a/ec5zwnnV/IAk/AnFFyGh7fZDeeMIafA6RDXSiXDSgxoDt2udtMaJbipZdemtBdHMFLegGWEXCIS7O4mjKD/nzNNdck8FjLQboGgGAAXm5YCC3gBFSYZvKGw7A84lIj1pH1MPUdkkNj+Q3lenuO/WHd9i1FsRXPYk1K+LgMTQjDnSL+zV0srOJSbjjQcZ5CBYBoVj+294q7iPxMKevXuVHKvAg3GtorvId/CJY/A1JK3FlTUADUvjr3eZby1hAe8/u5i343jwU4jdVq6+I6+0IGXUy3klVrP1ncO361/54s4FWGDgWYm+/E4/E5vgsL0bPnXZUBWygBZz6U1b6UCySZzWJCLCEvCFj0iE8uPO7JwMSEz//B6FzhELa1EDPd7+dGTwIlRuhnG7lGYkkYm9BYe9wfjAicNVqr3tW1amLPCEuX9Tp5j7G9wShcbe4b65PbKLBN02J0JUSlMuIY1rNZJYAG4EwjeyaWBERZ36USUKztpzzlKSmJhIcIMsUWYR5xuQBEa/eVUPmZXG43UIp72O0HfhuzH5th4HzwDNCZRWTDuuUUHvzgBy+PTQOUZOC6667LOgqPjInbOydyYK2s8EV4PcquhqSlXMzBrdlnn32SRfGmN70pCfssdCdMzGyZqrjwizs0hFvgmYL2PtPIdPM+UIprCRwjVgALhYA5OEzi3wkjgVvr/D9WtFADZoyEAgpLx++2fiAvAQHAASYAV6TsgiqWQSmg9HzvD1wosllnI9zhncIiyD2/kRATKvFIygT/2Tf7oqDa/rHs3OHeH99mfwk7t5vSyWHlCVVQtJS+9YydYY474cmcfZllFQJ21qdLxx70oAclXg35dWkb4HLHFPnOQfYLIJNBOCA0VbLOtAhQ0jwsSgwstkeo5rVnxVWUQbRI6cvHMLIgMTMfASsCCJyOPvrolJAC6GJ1L3vZy1LIYC0lTWHNBDAjYPxTP/VTSeBpUAwSZG0sOQyMtKCxDkoQoMG0BI3Ss65Zii9uz4sbI3O7aM5EPWDEeykPlraJP0IbvBSABeT7ZG32VDgoVx0h0CYPwJHHgLfWQ00sY4ChAyTRJZdckvZJWOU1r3lNd/LJJ6dbGymYHETRCp/gIx5h6eusiwAly4x7hm644YaF7ithtflgdKQIdtIVLUUh2JibmyTWIXsmPodxgKNQwlrNewcvWURg7Re3lhBJ1KifBKD77rvv8s9j0v4cPYwUayCAOV1cICNeBQSteZb1FfFJ7hohyn0/NFeSUD/hCU9IlhIrP6oXJBIpEm7kZHmYvaKIzjzzzGxWEWJNRkzQ1RRjCvtq9hRQCWUwWM4444yU6GLweBfWJN7MOYwaDvjgM17J2HWegwOlmFvUrSEgOe/qCHEIMSXZXBrEpujkqKEAPa7V5e6y9gibr6xMa10L44fr/f73vz+FBoAk5vC7Iz6zErEolQ8BMPvF0ov9Hrr0RGwU6CglEWebZSUCVIAPnLxTzkEPzgKfURhhWQNq+yEsQfHaB4F9a2Z5AnD7GkXf1pgTvOyDNdkHyk/8NJKG5CQSSBSKNeEHaxrz2mRhHecY5XO33nprAiv7BDh98KdQDMNhrcXc08jvtT/2hdU/dgnT4EAJPAg6YUEsspW0AUZhgT72sY9N2tffuUHiETXczWM9wEt/LuYmZCwXArfWGkHCQzsbjMAFx5iYgyXNvY9YJWsIEai+VUSjcykpmMjwyfKLPdnDoQDAM
8WPKQvCTDDi/e1HZJGRfXKO1u49cpaeeC5QZPn0gQlASkRQtFFyppSJax7xtmh0KHH/TcRQnRM3nMUNiKyTbETxNGXH6pR8Yr3lAPAIhcQQlklybpQvxSN8ce21185M+oSXMTRQ2i+/lyywxCPhSu6CompmXbvekVVc5OdYZzp1xDRpXNqKmc8tqqFdz/q4Gy95yUsSUGJsjKz9bK3rc9B+x4c+9KHl7/n78ccfn9xJjELp9GOYk8xhD6NnnIsEAFiob3jDGwbpHw5FIctOwAEOy8x+EHoAQKgkvfrr8R7WlHPogr0gSNYUoR4C5IwAj/IyIRPggMeUneFLgKSTQ51dboqeY1alc+yfpXWJ4bPWATxlxO2lgK6//vostZYUKOsM6OCtfkkUXqdIACXwk6xRnjaNcoIUnmIc8OYouiix6j/bGQ5Z/lMcKL0cLR3MizFsOobuk5cncIrLxeQiG+nlZXhrGWnPipI8iaJ1jK1DZqhC1kkiPAprWTs0PuaNO1GsJeI2UebElWMp2UfAxB2XvABeF1xwQQLLtcR3CBIh93spMrErCRJrw8QsHx+dJoigs4Yx+0knnZSA67bbbssy9ME+ADvlZ9Zl77iIvBEdT3gJMEpKSJDZO4r48ssv7z7wgQ8U9Vis1bNZt4qhWf14CBhRMgGiLPLjjjsurVsc3M8PSRTHox71qMQzzodCDT4GnBoJnLfETT+RGBRgSr6B6NDrQ9YXFiXlHN1zQdbLynX2chli0rljv4MDJWYl5ATllFNO6Z73vOclQBRnwBw22UFw3WgJMaawPLhEAACT19AtELVkyk4IWQgiBsmpUWn7KCfhjkeWVOaPAHl2xP6AqrgkZjF0wf5ieDE64KBYfS2xXs/y3sDQHnC9dZZw8QkLy7EPxIQ8Av0xGENxcq7pOMDO+7OSon4OIEVMS2jCXuy5557p7xGXLDW+Th+yM+Gm2jOlSDGQg3DbT6DEmPAePCuKJgb7Dk2ewxrDSxQvwCSv1oO3yC1ZxDOTYYnddtstWebKdoA4Oc1h0UX7MA+OsotOODhhT2BHTGESzgCk6jpzJnwGB0qCxepCrBwvJCMJcHzfS/oZIORQMHZsdr/ubuxLwqJmkqDRcOGyyS7L+uYCSO6sZ1IuwIWgRTH0NEYIq47gs/h07WBo/18tKxcOs21tMDxaO1mFAJjrplyD60q4+udk/YAdAFB4LL1csbYgz/du0yornKG9FMOkeACoRgE/mzsbHyQUYg9Y5BQeC2gyZON8w0Px89ZqL/sxuaEoknFi4hSIkEqEAABPJLucY78aYOedd07KhhIkt6ziXNUC9gt/4W3nhOfsDS8UUKrxZGDhc0AJ8HW2eadcvJYlRukwaASun3pDYEkT+RA6WUjxl2jbU/RNixEuY6dqSOIIBXDnMBKGjfYz3UU5snwIE2Dec845JzEmoebuA6mVNDdmEnjXR4+ZKChWjGQBoIhEwdYS5uMdWAuhmhWbDSvIGRvswbpk+WL8se55txeRNMF34pYlEjhB9soe4CHW4qwEYEyyAlgR781FFBjXVqUJLwTwCJ04J64/wuMsOcpmaWlpuVTNvzNmeBW5qlIAo/bcWedqLwElj4kRRk717gsj5ALvpZwMoqyAaygDDnjCOiHUwJRrxmqLaw8wMlCogQ4++ODuuc99borHIcxx8803/3/xkiHJPnBZxWYwA5BmDblelns2z82x5xdddFGyLsV9We2uHaCA1jo4dxHlhWEJoOc591ndUCXIu7OmgQ4LiHXi+gXvUTJbCiQpF1Ya0MHz/jzpMTlzFhtgn5aNHpLwEmWBX0zuAn5CJH0C6s7Tep/xjGekdbE4WaO651jlY5QwRSkVELceXpS9ZJXnXE/2MWusEdajw48MoBf1lXaNS9J9HF4ut3ZRwsxcVkMohAticAfgIfg5QwIOn3Ixkuuggw5Kbrg9Wk0ZEkvSngZx5SYTaTmn9tg/Wp1nMBaJgQs9OD8Kh5ITh1NnWjr2zQuJmYise91CkpWMhViLcxYr1BoYcUmAnnOYiD0R1xOzlewCihoq+oYCRe1cd9hhhxSP1IqsokI8fKw6T3FK8qErSEmh8xWe4n3lrJJZKqVVpxUee2klEQEEGGfMOYssD7Gk/tBP7gV3W5ggdwE8a5v7AyyFKSLj7ZoIcStAzSWaZAjrFt6wl6wSMdUoy6GUSoFDeAaUYwlwngWSBnLwBsSueC6ytwSJ1VF64DHlz4IDhqxbFRS+x8Ni5Tpj1pqZmQDAep2z/5OzM80Z4SXyhu8YKPiGgUDhiUfzDMT97rjjjhRblYySbS5ZthfWN4zgfeJxDQZCAb4nxKOCgRe1ri3KlTYAI9MKXjjAdKwkTsS0Ii4JfIAMl/Utb3lLKjkpoUU9Q+ufYLY1iK9hXsIPsDHqZMyPEHKPojc3CtStn3VXKnExOT1+DMJTCt5dnMXaAEaUnBjcWFPhgQ3+At4E3TlFJQhjgSvr3Cg0gCXEU6qFl0KTmPGRzQZAeI5MAkrlS3feeWeyJP29hNKNkjSfyA/ElHP1xRKWAN7+CYVZW+6rfkcDSlpL7JLLwRJxGBhqrCJzoOQgxPiin5omVaulHKik4NuDs846K4HcUUcdlaxJQKg+MsIXkxRxrX6fugy9i8nGvrOmJIXLHcMcuL7cy1xj1Ba13iRpJDYNEwbkEin9i7AkPyVILrvssuTajmEwcPc1P5BLhky0g+7y1e4b8sAFz00UPesWcMvCs27NQOXt8bCAomHfuphYlCUMgdGAkobw4tFYzz1yEGNYIhhWQFuPa4CkoLVkQHR2lCYu2BVXXJGK2ykTQKnuFGiymmYRZYN5xG2MGGOVlBrEwLV0rsqGrDXXZJlZxFoTu+KaReKQNVnDDZDRs+xaA7XFBJ/lxsUOixegKyEay6uSbT711FPTn61PWIA83nHHHSmmmbMahQyqm+Q9ieMyFqKVkVXJwhXDZ2mbFAUvWJXrtjNnUaKtgKTNwNBM/BxV/ouQeKRp7EoNQstjXJp9zFIlYMnlF7NkHcqMCqz7sH5ZIVGPyi0Cij5xXQW3s+TUFWth2WLe3K7QrDZLYGl/oj00CrtrIAqX56R/Wsw0euedVVx7MualcQAavws1sSyBE9f8S1/6UrIkc+YPKA3yJ2QCHPGt9ThXMqAaBmALTdjD0tULo1qUNgRhEFo/V33iSkSwWSDqFwFmDHPlBimBGPu2Q0Jub3y4JBQMQKL9uXOUje8Ttpi41B+yWpKcYwh9CRctiOXDXVStEKPggKQ4ryD/2JNn+gR4xqwIWIRYvj4l6QEPeMDyrFcgGK2xFInQEWMBSJbqqKoGKGWworCWgGPs0qAUCSWF0dExRJNxW7mNY05Unkb2yIf1SOvXRgqZgTT3nzVeiigOQX78FHdks6wVyQuhlLRuG20dAUUeJaAki7q/uNlCEzV4BKMBJVNbhhk4xSzG0sQSs4a4CwQAiX2YjMMiGtuaXE9krwTYYxpNyb2LLDuA1IPMTfMZw0NptHXEaowxc6g0D1ULlHGPjs0AULlG8a9ErEcHxLzXFaRO
TIsbAathKMd6o7GYm5JVxK1UBFiWvEeo0XBUGzhWAZQstpJxrFlAyb3WIxolHOoUxx4732h1xJoc8r71Ro3WDJSSCforayHrQVy+6PtcDfl5WcggWbYh3q9/n4gYWk17JswQMdqa1hVzN+PPNa0tBlHbu5rWFXfQI+U1cefU2LRp06blCpJaMWM1tOGeBW3d0047bWvW1KhRo0bV0qK4Nv++hkaNGjXazmnVrrci59I1ViuRdjBdIOJUej7HHvgbpAhcyQpS9rDIlb2lSAeSTL8C4v59PWOTrqO4U0bvcVx3UQPFVRvKWJSt1EJqf2Pqj1kAtdSMbtq0KXWTCVlImJYsF1sUM7ICpQxxTUCpDdJLA0jrqgUo1WcGUMqo17Rn6g0BpQRWTetSWxtASSFrm6uFdI0ASsXQNe2ZmGQApYLsWu6a2rRpU2q9jauBa9ozhe2rBcrmejdq1KjRHGpA2ahRo0ZzqAFlo0aNGtVacF4ria2IXzzykY9MU1M04ffjnnqtxVxaUXqj1ZCJRjF13eQg/fBjDTduVDFQCoQriJXJNKSgxlalmKz8xCc+MTG13lNB6D5D6+SJ+XzbOxleEHdpj51EMAXK9aXzbjn0lZKTyCo5+V3y7Pjjj08JKxOedKWNdV3GeqINGzak5Au8cH74bAwFUwwo3WZ48sknd6effnqazjPm3Tjzpgldeumly50Yk8QS0KOO6bd3MgRDTzULyYDjsUi3DGEyaSYu5woKhcw7uOWWW1Ifv8lCWlVNiCo9vMNHlvrTn/50GgPXaP7gmne+853JuPLV1dZjjForApQ0PVfWiH4zC2twOeJqWPcDszCUoyxiFXHLDz/88HTviWnapg1tbz3GlIjLzOwF68hA4VkTq42D40UYm5V7NBzrdhbwAdADDjgg8Z6fA6quyTBMowRYemYM5nWFMN4bAyjNgXUNxV577ZUusPO1T9bn7Bgz9kjN6Ic//OFubDKQOcYMGlyzzQGlTVcb5+MQWCB9oPTvQIfGV29Yamx/jHlT66VI15ULLAyCbzS/OzviWohJgNVfy4Wi7VZzley2Qiw4ZwZ8WGizJvXYq82bN6dC8pwF9zGZm2XrsjohAXMp+4IP3IV/gtyXfsghh6RxbM48d/2tPTICTg+79fkKLEuHLJ7//OenhgP3dpv96BOKIkbk+QpAWeluhrzzzjuTMTCGgWOKl6uGyZs9o3zHoOxA6WIgroYYjTFm0+4EwbQ0BebODZSATZG6ieaugRWPFKsi8Ex6H9PN/Yy1K+iN5n4Ud5D7PgvZ/TRrnXsYd5P094VgA2ShAAMFgM20cIDrHnxKCRzwY5W4U4WicG6U3CTFwA3Kxv9Z6Z6ftVLESQGRs2Qp8WBihJ+zso4YBoLwIzAggEA293Bf63O9iAuz7B2hd7Ome3JKEoOFMsFP4rQSk/bI/sRQksgn+PAWdt9992RZjnHxn7M11cv54KOxhn5kB0qXdrEqTJo+44wzkvsxSVxfjO1rbjOfFueCudHQhfSRwKFhCQ13iJvItSTsGAW40mxhmQRgveAFL0hWsI6ItVgkGJICYdl6hucBSdrcGh//+MenzpC+oMefxQl9hABKEGHiBmkDw7TOc9osUUwdE8dzX0MMEMW8xa8k3yTi8BMgoJwBNoAC8JN7SGEKu+QGSmfLU7EGIQsygQ9LAyUQxGPWAyS1FrLYrIkXhbf92TqdHXnZZ599UgJzrBtSGRHO2NrG8uCWSvSiuhjrE5/4xMzhvHpBWQHTYl1DkoN3PYBb8LgcYSn6CiSBo2tOgTWwJHDcM62I4iKYG6ASOsSFAmQ0HiHYWjriiCNSgoiLQ6iFA9xGJzyAYYGoyc80P+HH7K6xRX6GlVACKCkUis8teCwyF8+/613vSn3Z04CSJ+H/eC/KJDd5JqHmKVBs1nXssccm1xof7r///mmvWU/hiptmz9IDtCWuHHCnD0scLz372c/ujjnmmKIJJTd7sqS9q97waPsNg8E+XXDBBQk0KUIg9alPfWq06xicI9kC7AHkDK5tBighPzByu6HA8LymeFYVkOISGDwwtAABQk36wM+zJt1YrnbcBy0rikEIGrcWSIU1wLpzUxzABLySQQ5wa4DSOl7+8pcn0CE0++23X1oDIaK9gTWAcTc1Ycc0hIqgiw/6Pyw7jC4+h6FzCt2ee+6Z4rpxKZybMym5ac9krRFIwqcMJnfCyxkfeOCB3dOe9rRkETnDc845J4VSWIsUtQHNFDIAt394tGSsGRhRungsQjiliQvNOozkUkzyt3/OFQg5Nx6NcI7Q0o033jjaxHjrFN8Oi9+68KD3KBkzzQaUBISLBgQICRd1FoWpDyDj65AEfAEK0Mak02J93H6H0b8CgmsifgUsCRxgxNzeyd3Dfo/fLavPJV0NMxFUrrWpNFE2QmsDFUwAgNRqRi1nn2h6yScM7fZBIC+OxIXMxTyeCSgBjfdmhUl+zZpSb5/EJf0sTyJnMsczwkWkaJwlC9u+iVkSNhaj/bz99tsTDzivsCpZUepjS0zeAT5jNisAx8nwlzAFGcRDyviixMq5MRY+N/LdQ85FWR5Zc26UDd4rCZRZWhgJCcaF/MCDVaRubNrPEUAC7yugiGzlkMRFBpSC6P3Jy/2LqQCSwwCOQUATUAJ5f2blcSO5m0GAHVAJG6zGQpAoYomySIGdDKNnACCuDxdJ3d+0qSv2iZXJarI+ws5KymkVWaN4HuUXdz8Do/5+BdkHFmXEAykZIJ6LwhpyXa3QhfMCiAChHxt11sbK2bd+sTmgcB65yfPx2NjF+X0id4BHyduLX/zi5ZAUxeLMVAXUQDxSyoxFKV5fOlaZxaKM+sQjjzyyu/zyy5PbQ7imHRIB3HvvvZObaZ6keObQZFNX2lhWnMvOPH/eFbWskH5CAMABKZYMZbCo60trs2ww41vf+taURPL/FxUigu4D5K0HYOcigiNRwurwTOANcGatlStLedgba/N/ciYCCA+X27OAoASOmO80kjxxrS7lKSaNhDomC9VzdplMKzsbg/AyJXHYYYcl/u2TPRSDrukO8ntG7OZbylWMLCGBFPTOcrtYUNxy8bcQ+hwkO33QQQfN/HdMAaAXcTGUVpx66qlrXpO4LUv7oosuSq4FK3G17w8UlJgIBeQgZ8lSe/WrX51ieSxXVqQOKyVBszLZUWtKYeaOxbEgZdfFjT2HtS8ssVL4RpaX5RRAWZJY5c6L4qbc8CVQH6OdMUpvfOxj1CjaR96OM5dUvOmmmxKvkREhoTFIJxXP05nhQ8qG3JSaPzs4ULISxTqYx9wMcY5ZpRdRKCxgnNMdsamsnEmy0cBK+x0GWGTTvd8QNYHiVGuNVbFGAVYuoPSufrda07iQifXFAsGw6k2dW9QxhrATOKGOAMiclgArVwE1S034hAVE8Fc6S/xIMUVxdSnyPCEL+ymEIeZG+Lm3Y/V9c7Evvvji5fpKcWjnJxZIboQk7n3veyc5fc973pO8rjHAkhfDMxUj5w3wxvBcqVbowYGSpgy3kisZrkYIUh8cMA6mBagsEBq2f0M
aizPX/doEBYgrUmZhTKvvDKvK+qNukJvdB13vxBJlwZR2DZQL5XRpI9khWRUJMLFYSTGASaC8vzPiFQAq7q+wi6SA/fD9XEkSvGYdhMcZsToIsWcuQmNknfEbXicj9lJCL0dcflHixQh78ezIAzkgs2SRQWCde+yxR9or+yr+OwZQSqhKHAJs3gBr19/XLVACQpupwFwJzcEHH5zifkBTbGiyi8MBMOuV7rACgFYkWAABQcwBQH4vLXX22WdPdXkJYQCFusnrrrsuWQCYmmuJrIuWU3fp3UoDpfVF21kOF8TvjYQWIIpieAIkUSd5ggIo9QZz4Xwi5sdVX6niYS3kbKJQGsnQ1t53zwuQEBQyoUx8HRMogygZvGzwhHCL+mDWrr7whz/84YnnGQvkV6gglwGzEq4IzzGs8CGPxj5OSyauC6BkqQn8y95yMTAyhth3332TOd/PNoaQ0160hBpFMRuHJlmgNpCg5SinCHCZFRdUtEzbA3uC2LeqYu20mTIFQDnGXT0YF1OzKtU0Dr0GliDhUaT95Cc/OSkMrngks1gerGzg6fuyp31321mrwct1X4rnioPmCj1sbxTGCTAkg9dff33yDE877bQkD6x3ylEThnBVabC0NkYVuZQEnhZOW1dZb8AG6VX+c6W9EO3vA2wAJrcDgxMshdP+TVJFWYzN8P+jyHqtFAmFvsUX04NoyyD1dGKrvgJGlhM3yfrDeuvTJZdckmI2YqxjEJC0lyw+CbMcYA2EKQLWojOyb7EPqhUoQhakvWUtcbsxsn31f91AmbMOb9rZrpasr6aSnbEp6nqjdXC33pxPhhA3eIwMtDIznqoKh9Jhk2wF5wCubxZ7MUJmw7lxETAGQILIgsYYVocL92lIoRfL4P5FRwkC4JGYCALerEda1L/HeicJAzk01qS43Fg9sNbIFbEWyiUH8zoHmnza1bGebc8onNhnPfS+10/w5O6jDuKZrGZogjUCSNnc2l32MUj4REx+8+bNaV/lErZs2ZLCaGN4UPgoZM3aSg7IKDa4lxBHppdLFrFKQEkIWY+EinU09CHQQoShD5TAmlutw2ZRClc9XF1B7bFau2JAAGUkyzuv/jMHqRiY7OEW23LG9sraSnZPsG6jG2glHqIAffwMBSq8U3IYLAUXScuI+67VKh6S7B8vildA8e361QnjalCVCpUahbgSsXL7Y/O2m8vFWGc+0wrT10piZOoWV8uI/Z+PUfTa4BSHG0ohfjfGpPYYisuCsh6gXQtRQD7AUndMyf0RR1OaJhSwElGYrF5rtHcAveTQB8pEph6xisTnhXhqIXylb/5Vr3pVqkG+5557UjacQi4x3GQR/peJz9lkUd3lYg5BpluQWH1UDiIIrD8uAyty0fhGPzHhd4ifSlJhlsn2uJLEGgIItKp1jWXVTiNumq6cMQhAmg6lfEkZztVXX51ACSiykJynpKJuFMDE07jsssuK1zByISN7ywIGms5xbBAC2s5P7F2cWQ4B73/mM59J1SsMg7HCTLPiqNsNUCJM41BoepbStEGwayGuPbdBm6KuErHRRXpFw9UWKrj22mtTJl7iZlbNZSnitunNjU6KsQB7kiTBgCRNH9dr5BwAISsfReYqAJypWLeyFjEsiaWYjB0lODL3/o0lrn7Q/y89pALweK5QATc3KgfGJPuncJ9iEacH4DwontgVV1yRvSlkUTl2bmOEKUYHSi8MGGl8TM4qGBooaUFCq/5L/R8NDmRWCgazMnwcDqbWFkejjs0sSHnOk570pLR+sd2hpy1tLbFwKTyxI8pPgiknCPndMrBCKwQd2RN8ZB1q7WLCeh+IJJ0MWQCUY5yn8xrzzMKSDcs/xgcq74p9FLsFjmpTb7755tHmUfYpCt6DSma+RwdKRGs5COCFycVChiZWIcATd1FkTnN6VtQEssr6dZWSP0pbZOHVAtZE0f3EXWNNaSWsgcTaAFUUqgOknLV2nqNSQuGx840xeFFhMRnDinXdeuutqR2PhzAGRQ1vDMwt7UbySADi0Ucfnf7OeACc1kL5UHAaLOxPqcn5ixDDJXgqzrmUdVkFUIqxsfgc2Lnnnpv6TnMOAeB+y+gpSzruuOPSv7EwgKIEBCJ8Wrpqiv8hpTiKbTGHtU4bXzcWKR+Ju5etK7frjVj5akiNnWMhrjQFyJ6dd955qZB6rBIX5NlCQeLlYoGT4/1yk4E1BljLaKMAG0AkBu8qZgqo5L3nixJ5NNVILSUZdv72b7sAShoiMrdxwVFOEufg+gBHm44IuEOI+KNsbZS51ESsNtcxYG5MXRNQUkIsXPtI6RC03Nre+cTlcK985StT4kZboNmjYoESJFzzmIfJqxjrRsEg6xLS4dYKDRD0kgpZIitqmHlyrDSWo2oOzQVKzWJoSG101113pa4gjSFkQVvxdgOUiJuLeaJ4OSfFLEebzs1YL8StlfDSfyuRIV4z1tSZWbFgk3C4vICplJKJwvGwKLXfER4A4HwBU9SZ1qD4KGGfsSaHA2Xekj2hOAA2BRf7VjN95StfSaAuwaRbrFSjQDVAydX1aTSbWB+yuHrnJbxYv6W6XhYl/fmlieVDgAi+Dz4aYx3rhXhU8gJiuhGDL+n6r4V4edYvPFCSqgHKRvNJskutKXdDK5mMb44C/UbbNrHIfNw+0GiddeY0WpyAo4zlrOt/GzVqNLJFqUjW7MhaSO1elMzI4tUSgO7XaOqiGaLrSABe2Ytst9+poHpr3jda/LjyNZ1lPz7tygTvWAtFb7b6zJr2jHcRZKzhmDc89kmyMcYSxlXRtWHGamjDPQtKmpl0jRo1arQt0aK41lzvRo0aNRrKomzUqFGj7ZWaRdmoUaNGc6gBZaNGjRrNoQaUjRo1ajSHGlA2atSo0RxqQNmoUaNGc6gBZaNGjRrNoQaUjRo1ajSHGlA2atSo0RxqQNmoUaNG3cr0vxcNTBSfgM1fAAAAAElFTkSuQmCC",
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"GroundTruth: 7 2 1 0 4 1 4 9 5 9 0 6 9 0 1 5 9 7 3 4 9 6 6 5 4 0 7 4 0 1 3 1\n"
]
}
],
"source": [
"import torchvision\n",
"import torchvision.transforms as transforms\n",
"\n",
"BATCH_SIZE = 32\n",
"\n",
"transform = transforms.Compose(\n",
" [transforms.ToTensor(),\n",
" transforms.Normalize((0.5,), (0.5,))])\n",
"\n",
"trainset = torchvision.datasets.MNIST(root='../data', train=True,\n",
" download=True, transform=transform)\n",
"trainloader = torch.utils.data.DataLoader(trainset, batch_size=BATCH_SIZE,\n",
" shuffle=True, num_workers=2)\n",
"\n",
"testset = torchvision.datasets.MNIST(root='../data', train=False,\n",
" download=True, transform=transform)\n",
"testloader = torch.utils.data.DataLoader(testset, batch_size=BATCH_SIZE,\n",
" shuffle=False, num_workers=2)\n",
"\n",
"# get some random training images\n",
"dataiter = iter(trainloader)\n",
"images, labels = next(dataiter)\n",
"\n",
"dataiter = iter(testloader)\n",
"images, labels = next(dataiter)\n",
"\n",
"# print images\n",
"show_image(torchvision.utils.make_grid(images))\n",
"print('GroundTruth: ', ' '.join('%5s' % labels[j].item() for j in range(BATCH_SIZE)))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Each image has 1 channel (note the channel dimension is before the spatial dimensions), is 28x28 pixels, and our `images` tensor here has a batch of 32 of them:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"torch.Size([32, 1, 28, 28])\n"
]
}
],
"source": [
"print(images.shape)"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "BfN6oCjwrqR1"
},
"source": [
"## Example CNN Architecture\n",
"\n",
"Here's a diagram of the LeNet architecture:\n",
"\n",
"\n",
"Below is an example model provided to you. This architecture is similar, though not identical to the original LeNet architecture depicted above. It is pretty good and it reaches >98% accuracy on the test set, although better accuracy is quite possible.\n",
"\n",
"A few important things to notice about this architecture that are typical of CNN architectures:\n",
"* The network begins by alternating between:\n",
" * conv layers, which keep the spatial dimensions mostly the same, except for the few pixels lost to \"valid\" output size\n",
" * Some layer which reduces the spatial resolution.\n",
"* As the spatial dimensions get smaller, the channel (number of filters, or feature map depth) gets larger\n",
"* At some point, we start ignoring the spatial dimensions (conceptually \"unrolling the (h x w x c) feature map into a 1D vector of length (h*w*c), then apply some linear (fully-connected) layers.\n",
"* In this case, the layer that halves the spatial resolution is a 2x2 [max pooling](https://pytorch.org/docs/stable/generated/torch.nn.MaxPool2d.html) layer with stride 2, meaning it takes the max value in every (non-overlapping) 2x2 block of pixels."
]
},
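{
"cell_type": "markdown",
"metadata": {},
"source": [
"Here's a tiny sketch of what that 2x2 max pool does, on a made-up 4x4 feature map:\n",
"\n",
"```python\n",
"import torch\n",
"import torch.nn as nn\n",
"\n",
"pool = nn.MaxPool2d(2, 2)\n",
"t = torch.arange(16.).reshape(1, 1, 4, 4)  # batch of 1, 1 channel, 4x4\n",
"print(t[0, 0])\n",
"# tensor([[ 0.,  1.,  2.,  3.],\n",
"#         [ 4.,  5.,  6.,  7.],\n",
"#         [ 8.,  9., 10., 11.],\n",
"#         [12., 13., 14., 15.]])\n",
"print(pool(t)[0, 0])  # max of each non-overlapping 2x2 block\n",
"# tensor([[ 5.,  7.],\n",
"#         [13., 15.]])\n",
"```"
]
},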
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
},
"colab_type": "code",
"id": "jjvxIDrXrujd"
},
"outputs": [],
"source": [
"class ExampleModel(nn.Module):\n",
" def __init__(self):\n",
" super(ExampleModel, self).__init__()\n",
" # Convolution. Input channels: 1, output channels: 6, kernel size: 5\n",
" self.conv1 = nn.Conv2d(1, 6, 5)\n",
" # Max-pooling layer that will halve the HxW resolution\n",
" self.pool = nn.MaxPool2d(2, 2)\n",
" # Another 5x5 convolution that brings channel count up to 16\n",
" self.conv2 = nn.Conv2d(6, 16, 5)\n",
" \n",
" # Three fully connected layers\n",
" self.fc1 = nn.Linear(16 * 4 * 4, 60)\n",
" self.fc2 = nn.Linear(60, 40)\n",
" self.fc3 = nn.Linear(40, 10)\n",
"\n",
" def forward(self, x):\n",
" # Apply convolution, activation and pooling\n",
" # Output width after convolution = (input_width - (kernel_size - 1) / 2)\n",
" # Output width after pooling = input_width / 2\n",
" \n",
" # x.size() = Bx1x28x28\n",
" x = self.pool(F.relu(self.conv1(x)))\n",
" # x.size() = Bx6x12x12\n",
" x = self.pool(F.relu(self.conv2(x)))\n",
" # x.size() = Bx16x4x4\n",
" \n",
" # Flatten the output\n",
" x = x.view(-1, 16 * 4 * 4)\n",
" x = F.relu(self.fc1(x))\n",
" x = F.relu(self.fc2(x))\n",
" x = self.fc3(x)\n",
" return x\n",
"\n",
"\n",
"example_cnn = ExampleModel()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Parameter Counting\n",
"\n",
"In this exploration, we're going to pay particular attention to the number of parameters in a model - that is, how many weights do we need to store and learn when training the network?\n",
"\n",
"Here's the count for the `ExampleModel` above:"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
},
"colab_type": "code",
"id": "jjvxIDrXrujd"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Model number of parameters: 20842\n"
]
}
],
"source": [
"print(f\"Model number of parameters: {get_n_params(example_cnn)}\")"
]
},
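{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a sanity check, we can reproduce this count by hand. A conv layer with `c_in` input channels, `c_out` filters, and `k x k` kernels has `c_out * (c_in*k*k + 1)` parameters (the +1 is each filter's bias); a linear layer mapping `n` inputs to `m` outputs has `m * (n + 1)`:\n",
"\n",
"* `conv1`: `6 * (1*5*5 + 1) = 156`\n",
"* `conv2`: `16 * (6*5*5 + 1) = 2416`\n",
"* `fc1`: `60 * (16*4*4 + 1) = 15420`\n",
"* `fc2`: `40 * (60 + 1) = 2440`\n",
"* `fc3`: `10 * (40 + 1) = 410`\n",
"\n",
"Total: `156 + 2416 + 15420 + 2440 + 410 = 20842`, matching the printed count."
]
},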
{
"cell_type": "markdown",
"metadata": {},
"source": [
"And here's the parameter count for our tiny 3-layer MLP from last class:"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Model number of parameters: 118282\n"
]
}
],
"source": [
"mlp = ML.MLP(28*28, 10) # 28x28 pixel input, 10-class output\n",
"print(f\"Model number of parameters: {get_n_params(mlp)}\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Notice: 3-layer MLP has a ton of parameters relative to a CNN with more layers! This is because conv layers learn a small number of weights that are applied sliding-window fashion across the entire input, regardless of its spatial dimensions. In contrast, the MLP has \"densely connected\" weights, where everything in one layer depends on everything in the prior layer.\n",
"\n",
"Let's look at the relative performance of these two models on MNIST:"
]
},
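{
"cell_type": "markdown",
"metadata": {},
"source": [
"To see where the gap comes from, compare first layers. If `ML.MLP` uses two hidden layers of 128 units (an assumption, but it matches the printed total of 118,282), its first linear layer alone needs `128 * (28*28 + 1) = 100480` parameters, because every hidden unit gets its own weight for every input pixel. The CNN's `conv1` needs just `6 * (5*5 + 1) = 156`: the same 25 weights per filter are reused at every spatial position."
]
},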
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "7NBY2H8yrgRE"
},
"source": [
"## Training Loop\n",
"The following function trains a model."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
},
"colab_type": "code",
"id": "EVzqqVy9rfLk"
},
"outputs": [],
"source": [
"PRINT_EVERY = 100\n",
"\n",
"def train_model(net):\n",
" \n",
" criterion = nn.CrossEntropyLoss()\n",
" optimizer = optim.Adam(net.parameters(), lr=0.001)\n",
" \n",
" net.to(device)\n",
" \n",
" net.train() # set the network in \"training mode\"\n",
" \n",
" for epoch in range(10): # loop over the dataset multiple times\n",
" \n",
" running_loss = 0.0\n",
" for i, data in enumerate(trainloader, 0):\n",
" # get the inputs\n",
" inputs, labels = data\n",
" \n",
" inputs = inputs.to(device)\n",
" labels = labels.to(device)\n",
" \n",
" # zero the parameter gradients\n",
" optimizer.zero_grad()\n",
" \n",
" # forward + backward + optimize\n",
" outputs = net(inputs)\n",
" loss = criterion(outputs, labels)\n",
" loss.backward()\n",
" optimizer.step()\n",
" \n",
" # print statistics\n",
" running_loss += loss\n",
" if i % PRINT_EVERY == PRINT_EVERY - 1: # print every PRINT_EVERY mini-batches\n",
" #show_image(torchvision.utils.make_grid(inputs.data))\n",
" print(f\"[{epoch + 1}, {i+1:5d}] loss: {running_loss/100:.3f}\", end=\"\\r\", flush=True)\n",
" running_loss = 0.0\n",
" \n",
" print('Finished Training')\n",
" return net"
]
},
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "W0SLImjx1QE1"
},
"source": [
"## Testing\n",
"\n",
"The function below evaluates a trained model on the test set. If we were doing this for real, we should only run a model on the test set once, before publishing your results. In this assignment, we're re-using the test set, treating it more like a validation set."
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
},
"colab_type": "code",
"id": "jo1Zkn5EyNG7"
},
"outputs": [],
"source": [
"def test_model(net):\n",
" correct = 0\n",
" total = 0\n",
" \n",
" \n",
" with torch.no_grad():\n",
" net.eval()\n",
" for data in testloader:\n",
" images, labels = data\n",
" \n",
" # if linear_model:\n",
" # images = images.reshape((-1, 28*28))\n",
" \n",
" images = images.to(device)\n",
" labels = labels.to(device)\n",
" \n",
" outputs = net(images)\n",
" _, predicted = torch.max(outputs.data, 1)\n",
" total += labels.size(0)\n",
" correct += (predicted == labels).sum()\n",
" \n",
" acc = 100 * correct / total\n",
" \n",
" print(f\"# Parameters: {get_n_params(net)}\")\n",
" print(f'Accuracy of the network on the 10000 test images: {acc}%')\n",
" print(f'Correct: {correct}/{total}\\n')\n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# TODO\n",
"\n",
"1. Train the MLP, then test to see its performance.\n",
"2. Train the example CNN, then test to see its performance.\n",
"3. Modify the example model to either improve its performance or decrease its parameter count without hurting performance. Can you get over 99%? 99.5%?\n",
"\n",
"A few of ideas you can try:\n",
"* Different model structure (e.g. more layers, smaller/bigger kernels)\n",
"* Residual connections [0]\n",
"* Batch [2] / Layer Normalization [3]\n",
"* Densely connected architectures [1]\n",
"\n",
"[0] https://paperswithcode.com/method/residual-connection\n",
" \n",
"[1] Huang, G., Liu, Z., Weinberger, K. Q., & van der Maaten, L. (2017, July). Densely connected convolutional networks. In Proceedings of the IEEE conference on computer vision and pattern recognition (Vol. 1, No. 2, p. 3).\n",
"\n",
"[2] Ioffe, S., & Szegedy, C. (2015). Batch normalization: Accelerating deep network training by reducing internal covariate shift. arXiv preprint arXiv:1502.03167.\n",
"\n",
"[3] Ba, J. L., Kiros, J. R., & Hinton, G. E. (2016). Layer normalization. arXiv preprint arXiv:1607.06450.\n",
""
]
},
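{
"cell_type": "markdown",
"metadata": {},
"source": [
"For reference, here's a minimal sketch of a residual connection [0] combined with batch normalization [2]. This `ResidualBlock` is a hypothetical module, not part of any provided code; note that the channel counts must match for the skip-connection add to work:\n",
"\n",
"```python\n",
"import torch.nn as nn\n",
"import torch.nn.functional as F\n",
"\n",
"class ResidualBlock(nn.Module):\n",
"    def __init__(self, channels):\n",
"        super().__init__()\n",
"        # 'same' padding keeps HxW fixed so the skip-connection shapes line up\n",
"        self.conv1 = nn.Conv2d(channels, channels, 3, padding='same')\n",
"        self.bn1 = nn.BatchNorm2d(channels)\n",
"        self.conv2 = nn.Conv2d(channels, channels, 3, padding='same')\n",
"        self.bn2 = nn.BatchNorm2d(channels)\n",
"\n",
"    def forward(self, x):\n",
"        out = F.relu(self.bn1(self.conv1(x)))\n",
"        out = self.bn2(self.conv2(out))\n",
"        return F.relu(x + out)  # the skip connection\n",
"```\n",
"\n",
"You could stack a few of these between the downsampling layers of your model; each block adds roughly `2 * channels * (channels*3*3 + 1)` conv parameters plus a handful of batch-norm parameters."
]
},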
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Finished Training 0.065\n"
]
}
],
"source": [
"# Train the MLP\n",
"mlp = ML.MLP(28*28, 10) # 28x28 pixel input, 10-class output\n",
"mlp = train_model(mlp)"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"# Parameters: 118282\n",
"Accuracy of the network on the 10000 test images: 97.12000274658203%\n",
"Correct: 9712/10000\n",
"\n"
]
}
],
"source": [
"test_model(mlp)"
]
},
{
"cell_type": "code",
"execution_count": 48,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Finished Training 0.024\n"
]
}
],
"source": [
"cnn = train_model(my_cnn)"
]
},
{
"cell_type": "code",
"execution_count": 49,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"# Parameters: 44590\n",
"Accuracy of the network on the 10000 test images: 98.61000061035156%\n",
"Correct: 9861/10000\n",
"\n"
]
}
],
"source": [
"test_model(cnn)"
]
},
{
"cell_type": "code",
"execution_count": 46,
"metadata": {
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
},
"colab_type": "code",
"id": "k2CMdk8LKGdW"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Model number of parameters: 44590\n"
]
}
],
"source": [
"class YourModel(nn.Module):\n",
" def __init__(self):\n",
" super(YourModel, self).__init__()\n",
" # Convolution. Input channels: 1, output channels: 6, kernel size: 5\n",
" self.conv1 = nn.Conv2d(1, 16, 3, padding='same')\n",
" self.conv2 = nn.Conv2d(16, 16, 3, stride=2)\n",
" \n",
" self.conv3 = nn.Conv2d(16, 16, 3, padding='same')\n",
" self.conv4 = nn.Conv2d(16, 16, 3, stride=2)\n",
" \n",
" # Three fully connected layers\n",
" self.fc1 = nn.Linear(16 * 6 * 6, 60)\n",
" self.fc2 = nn.Linear(60, 40)\n",
" self.fc3 = nn.Linear(40, 10)\n",
"\n",
" def forward(self, x):\n",
" # Apply convolution, activation and pooling\n",
" # Output width after convolution = (input_width - (kernel_size - 1) / 2)\n",
" # Output width after pooling = input_width / 2\n",
" \n",
" # x.size() = Bx1x28x28\n",
" x = x + F.relu(self.conv1(x)) # residual connection!\n",
" # x.size() = Bx6x28x28\n",
" x = F.relu(self.conv2(x))\n",
" # x.size() = Bx12x13x13\n",
" x = x + F.relu(self.conv3(x)) # residual connection!\n",
" # x.size() = Bx16x13x13\n",
" x = F.relu(self.conv4(x))\n",
" # x.size() = Bx16x6x6\n",
" \n",
" # Flatten the output\n",
" x = x.view(-1, 16 * 6 * 6)\n",
" x = F.relu(self.fc1(x))\n",
" x = F.relu(self.fc2(x))\n",
" x = self.fc3(x)\n",
" return x\n",
"\n",
"my_cnn = YourModel()\n",
"\n",
"print(f\"Model number of parameters: {get_n_params(my_cnn)}\")"
]
},
{
"cell_type": "code",
"execution_count": 47,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"==========================================================================================\n",
"Layer (type:depth-idx) Output Shape Param #\n",
"==========================================================================================\n",
"YourModel [8, 10] --\n",
"├─Conv2d: 1-1 [8, 16, 28, 28] 160\n",
"├─Conv2d: 1-2 [8, 16, 13, 13] 2,320\n",
"├─Conv2d: 1-3 [8, 16, 13, 13] 2,320\n",
"├─Conv2d: 1-4 [8, 16, 6, 6] 2,320\n",
"├─Linear: 1-5 [8, 60] 34,620\n",
"├─Linear: 1-6 [8, 40] 2,440\n",
"├─Linear: 1-7 [8, 10] 410\n",
"==========================================================================================\n",
"Total params: 44,590\n",
"Trainable params: 44,590\n",
"Non-trainable params: 0\n",
"Total mult-adds (Units.MEGABYTES): 8.24\n",
"==========================================================================================\n",
"Input size (MB): 0.03\n",
"Forward/backward pass size (MB): 1.19\n",
"Params size (MB): 0.18\n",
"Estimated Total Size (MB): 1.40\n",
"=========================================================================================="
]
},
"execution_count": 47,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"import torchinfo\n",
"torchinfo.summary(my_cnn, input_size=(8, 1, 28, 28))"
]
},
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "KmLl5haX4Om_"
},
"source": [
"### Per-class accuracy\n",
"\n",
"Run the below cell to see which digits your model is better at recognizing and which digits it gets confused by."
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {
"colab": {
"autoexec": {
"startup": false,
"wait_interval": 0
}
},
"colab_type": "code",
"id": "qp8POK0dyOKn"
},
"outputs": [
{
"ename": "RuntimeError",
"evalue": "shape '[-1, 784]' is invalid for input of size 18432",
"output_type": "error",
"traceback": [
"\u001b[31m---------------------------------------------------------------------------\u001b[39m",
"\u001b[31mRuntimeError\u001b[39m Traceback (most recent call last)",
"\u001b[36mCell\u001b[39m\u001b[36m \u001b[39m\u001b[32mIn[19]\u001b[39m\u001b[32m, line 11\u001b[39m\n\u001b[32m 8\u001b[39m images = images.to(device)\n\u001b[32m 9\u001b[39m labels = labels.to(device)\n\u001b[32m---> \u001b[39m\u001b[32m11\u001b[39m outputs = \u001b[43mnet\u001b[49m\u001b[43m(\u001b[49m\u001b[43mimages\u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 13\u001b[39m _, predicted = torch.max(outputs.data, \u001b[32m1\u001b[39m)\n\u001b[32m 14\u001b[39m c = (predicted == labels).squeeze()\n",
"\u001b[36mFile \u001b[39m\u001b[32m~/Documents/2610/1053/Lectures/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py:1775\u001b[39m, in \u001b[36mModule._wrapped_call_impl\u001b[39m\u001b[34m(self, *args, **kwargs)\u001b[39m\n\u001b[32m 1773\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mself\u001b[39m._compiled_call_impl(*args, **kwargs) \u001b[38;5;66;03m# type: ignore[misc]\u001b[39;00m\n\u001b[32m 1774\u001b[39m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[32m-> \u001b[39m\u001b[32m1775\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[43m.\u001b[49m\u001b[43m_call_impl\u001b[49m\u001b[43m(\u001b[49m\u001b[43m*\u001b[49m\u001b[43margs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n",
"\u001b[36mFile \u001b[39m\u001b[32m~/Documents/2610/1053/Lectures/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py:1786\u001b[39m, in \u001b[36mModule._call_impl\u001b[39m\u001b[34m(self, *args, **kwargs)\u001b[39m\n\u001b[32m 1781\u001b[39m \u001b[38;5;66;03m# If we don't have any hooks, we want to skip the rest of the logic in\u001b[39;00m\n\u001b[32m 1782\u001b[39m \u001b[38;5;66;03m# this function, and just call forward.\u001b[39;00m\n\u001b[32m 1783\u001b[39m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m (\u001b[38;5;28mself\u001b[39m._backward_hooks \u001b[38;5;129;01mor\u001b[39;00m \u001b[38;5;28mself\u001b[39m._backward_pre_hooks \u001b[38;5;129;01mor\u001b[39;00m \u001b[38;5;28mself\u001b[39m._forward_hooks \u001b[38;5;129;01mor\u001b[39;00m \u001b[38;5;28mself\u001b[39m._forward_pre_hooks\n\u001b[32m 1784\u001b[39m \u001b[38;5;129;01mor\u001b[39;00m _global_backward_pre_hooks \u001b[38;5;129;01mor\u001b[39;00m _global_backward_hooks\n\u001b[32m 1785\u001b[39m \u001b[38;5;129;01mor\u001b[39;00m _global_forward_hooks \u001b[38;5;129;01mor\u001b[39;00m _global_forward_pre_hooks):\n\u001b[32m-> \u001b[39m\u001b[32m1786\u001b[39m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mforward_call\u001b[49m\u001b[43m(\u001b[49m\u001b[43m*\u001b[49m\u001b[43margs\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m*\u001b[49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[32m 1788\u001b[39m result = \u001b[38;5;28;01mNone\u001b[39;00m\n\u001b[32m 1789\u001b[39m called_always_called_hooks = \u001b[38;5;28mset\u001b[39m()\n",
"\u001b[36mCell\u001b[39m\u001b[36m \u001b[39m\u001b[32mIn[18]\u001b[39m\u001b[32m, line 32\u001b[39m, in \u001b[36mYourModel.forward\u001b[39m\u001b[34m(self, x)\u001b[39m\n\u001b[32m 28\u001b[39m x = F.relu(\u001b[38;5;28mself\u001b[39m.conv4(x))\n\u001b[32m 29\u001b[39m \u001b[38;5;66;03m# x.size() = Bx16x7x7\u001b[39;00m\n\u001b[32m 30\u001b[39m \n\u001b[32m 31\u001b[39m \u001b[38;5;66;03m# Flatten the output\u001b[39;00m\n\u001b[32m---> \u001b[39m\u001b[32m32\u001b[39m x = \u001b[43mx\u001b[49m\u001b[43m.\u001b[49m\u001b[43mview\u001b[49m\u001b[43m(\u001b[49m\u001b[43m-\u001b[49m\u001b[32;43m1\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[32;43m16\u001b[39;49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m \u001b[49m\u001b[32;43m7\u001b[39;49m\u001b[43m \u001b[49m\u001b[43m*\u001b[49m\u001b[43m \u001b[49m\u001b[32;43m7\u001b[39;49m\u001b[43m)\u001b[49m\n\u001b[32m 33\u001b[39m x = F.relu(\u001b[38;5;28mself\u001b[39m.fc1(x))\n\u001b[32m 34\u001b[39m x = F.relu(\u001b[38;5;28mself\u001b[39m.fc2(x))\n",
"\u001b[31mRuntimeError\u001b[39m: shape '[-1, 784]' is invalid for input of size 18432"
]
}
],
"source": [
"net = my_cnn\n",
"\n",
"class_correct = list(0. for i in range(10))\n",
"class_total = list(0. for i in range(10))\n",
"\n",
"for data in testloader:\n",
" images, labels = data\n",
" images = images.to(device)\n",
" labels = labels.to(device)\n",
" \n",
" outputs = net(images)\n",
" \n",
" _, predicted = torch.max(outputs.data, 1)\n",
" c = (predicted == labels).squeeze()\n",
" for i in range(4):\n",
" label = labels[i]\n",
" class_correct[label] += c[i]\n",
" class_total[label] += 1\n",
"\n",
"\n",
"for i in range(10):\n",
" print('Accuracy of %5s : %2d %%' % (i, 100 * class_correct[i] / class_total[i]))"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"accelerator": "GPU",
"colab": {
"collapsed_sections": [],
"default_view": {},
"name": "CS5670_Project5_MNISTChallenge.ipynb",
"provenance": [],
"version": "0.3.2",
"views": {}
},
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.12"
}
},
"nbformat": 4,
"nbformat_minor": 4
}