{"cells": [{"cell_type": "markdown", "id": "4dd83edf", "metadata": {"papermill": {"duration": 0.012933, "end_time": "2025-04-08T10:51:28.365208", "exception": false, "start_time": "2025-04-08T10:51:28.352275", "status": "completed"}, "tags": []}, "source": ["\n", "# Tutorial 4: Inception, ResNet and DenseNet\n", "\n", "* **Author:** Phillip Lippe\n", "* **License:** CC BY-SA\n", "* **Generated:** 2025-04-08T10:51:20.594274\n", "\n", "In this tutorial, we will implement and discuss variants of modern CNN architectures.\n", "There have been many different architectures been proposed over the past few years.\n", "Some of the most impactful ones, and still relevant today, are the following: [GoogleNet](https://arxiv.org/abs/1409.4842)/Inception architecture (winner of ILSVRC 2014), [ResNet](https://arxiv.org/abs/1512.03385) (winner of ILSVRC 2015), and [DenseNet](https://arxiv.org/abs/1608.06993) (best paper award CVPR 2017).\n", "All of them were state-of-the-art models when being proposed, and the core ideas of these networks are the foundations for most current state-of-the-art architectures.\n", "Thus, it is important to understand these architectures in detail and learn how to implement them.\n", "This notebook is part of a lecture series on Deep Learning at the University of Amsterdam.\n", "The full list of tutorials can be found at https://uvadlc-notebooks.rtfd.io.\n", "\n", "\n", "---\n", "Open in [{height=\"20px\" width=\"117px\"}](https://colab.research.google.com/github/PytorchLightning/lightning-tutorials/blob/publication/.notebooks/course_UvA-DL/04-inception-resnet-densenet.ipynb)\n", "\n", "Give us a \u2b50 [on Github](https://www.github.com/Lightning-AI/lightning/)\n", "| Check out [the documentation](https://lightning.ai/docs/)\n", "| Join us [on Discord](https://discord.com/invite/tfXFetEZxv)"]}, {"cell_type": "markdown", "id": "7319661c", "metadata": {"papermill": {"duration": 0.012008, "end_time": "2025-04-08T10:51:28.389002", "exception": false, "start_time": "2025-04-08T10:51:28.376994", "status": "completed"}, "tags": []}, "source": ["## Setup\n", "This notebook requires some packages besides pytorch-lightning."]}, {"cell_type": "code", "execution_count": 1, "id": "7aef20a9", "metadata": {"colab": {}, "colab_type": "code", "execution": {"iopub.execute_input": "2025-04-08T10:51:28.411059Z", "iopub.status.busy": "2025-04-08T10:51:28.410649Z", "iopub.status.idle": "2025-04-08T10:51:29.796085Z", "shell.execute_reply": "2025-04-08T10:51:29.795377Z"}, "id": "LfrJLKPFyhsK", "lines_to_next_cell": 0, "papermill": {"duration": 1.399062, "end_time": "2025-04-08T10:51:29.797936", "exception": false, "start_time": "2025-04-08T10:51:28.398874", "status": "completed"}, "tags": []}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["\u001b[33mWARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager, possibly rendering your system unusable.It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv. 
Use the --root-user-action option if you know what you are doing and want to suppress this warning.\u001b[0m\u001b[33m\r\n", "\u001b[0m"]}, {"name": "stdout", "output_type": "stream", "text": ["\r\n", "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m A new release of pip is available: \u001b[0m\u001b[31;49m24.2\u001b[0m\u001b[39;49m -> \u001b[0m\u001b[32;49m25.0.1\u001b[0m\r\n", "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m To update, run: \u001b[0m\u001b[32;49mpython -m pip install --upgrade pip\u001b[0m\r\n"]}], "source": ["! pip install --quiet \"matplotlib\" \"torchmetrics >=1.0,<1.8\" \"tensorboard\" \"pytorch-lightning >=2.0,<2.6\" \"torch >=1.8.1,<2.7\" \"numpy <3.0\" \"seaborn\" \"tabulate\" \"torchvision\""]}, {"cell_type": "markdown", "id": "9e2a51a2", "metadata": {"papermill": {"duration": 0.009838, "end_time": "2025-04-08T10:51:29.818376", "exception": false, "start_time": "2025-04-08T10:51:29.808538", "status": "completed"}, "tags": []}, "source": ["
| Model | Val Accuracy | Test Accuracy | Num Parameters |
|---|---|---|---|
| GoogleNet | 90.40% | 89.71% | 260,650 |
| ResNet | 91.84% | 91.06% | 272,378 |
| ResNetPreAct | 91.80% | 91.07% | 272,250 |
| DenseNet | 90.72% | 90.23% | 239,146 |
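
To make the ResNet vs. ResNetPreAct comparison above concrete, here is a minimal PyTorch sketch of the two residual block variants. The class names, the `subsample` flag, and the exact shortcut layout are illustrative assumptions rather than the notebook's exact code; the point is the reordering that separates the two rows: the original block applies BatchNorm and ReLU after each convolution and again after the skip addition, while the pre-activation variant moves BatchNorm and ReLU in front of each convolution and leaves the sum untouched.

```python
import torch
import torch.nn as nn


class ResNetBlock(nn.Module):
    """Original (post-activation) residual block: conv-BN-ReLU-conv-BN, add skip, then ReLU."""

    def __init__(self, c_in, c_out, subsample=False):
        super().__init__()
        stride = 2 if subsample else 1  # halve the spatial size when subsampling
        self.net = nn.Sequential(
            nn.Conv2d(c_in, c_out, kernel_size=3, padding=1, stride=stride, bias=False),
            nn.BatchNorm2d(c_out),
            nn.ReLU(inplace=True),
            nn.Conv2d(c_out, c_out, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(c_out),
        )
        # 1x1 convolution on the shortcut when the shape changes; this sketch
        # assumes the channel count only changes together with subsampling.
        self.downsample = nn.Conv2d(c_in, c_out, kernel_size=1, stride=2) if subsample else None
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.net(x)
        skip = x if self.downsample is None else self.downsample(x)
        return self.act(out + skip)


class PreActResNetBlock(nn.Module):
    """Pre-activation variant: BN-ReLU-conv twice, add skip, no activation after the addition."""

    def __init__(self, c_in, c_out, subsample=False):
        super().__init__()
        stride = 2 if subsample else 1
        self.net = nn.Sequential(
            nn.BatchNorm2d(c_in),
            nn.ReLU(inplace=True),
            nn.Conv2d(c_in, c_out, kernel_size=3, padding=1, stride=stride, bias=False),
            nn.BatchNorm2d(c_out),
            nn.ReLU(inplace=True),
            nn.Conv2d(c_out, c_out, kernel_size=3, padding=1, bias=False),
        )
        # The shortcut is also pre-activated before its 1x1 convolution.
        self.downsample = (
            nn.Sequential(
                nn.BatchNorm2d(c_in),
                nn.ReLU(inplace=True),
                nn.Conv2d(c_in, c_out, kernel_size=1, stride=2, bias=False),
            )
            if subsample
            else None
        )

    def forward(self, x):
        out = self.net(x)
        skip = x if self.downsample is None else self.downsample(x)
        return out + skip


# Quick shape check: a subsampling block maps (1, 16, 32, 32) to (1, 32, 16, 16).
if __name__ == "__main__":
    x = torch.randn(1, 16, 32, 32)
    print(ResNetBlock(16, 32, subsample=True)(x).shape)
    print(PreActResNetBlock(16, 32, subsample=True)(x).shape)
```

The slight difference in parameter counts between the two table rows comes from this reordering (e.g. which BatchNorm layers exist and where the shortcut's normalization sits), not from any change in the convolutional widths.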