{"cells": [{"cell_type": "markdown", "id": "876a23ac", "metadata": {"papermill": {"duration": 0.016786, "end_time": "2025-04-08T10:53:38.287561", "exception": false, "start_time": "2025-04-08T10:53:38.270775", "status": "completed"}, "tags": []}, "source": ["\n", "# Tutorial 13: Self-Supervised Contrastive Learning with SimCLR\n", "\n", "* **Author:** Phillip Lippe\n", "* **License:** CC BY-SA\n", "* **Generated:** 2025-04-08T10:53:30.432958\n", "\n", "In this tutorial, we will take a closer look at self-supervised contrastive learning.\n", "Self-supervised learning, or also sometimes called unsupervised learning, describes the scenario where we have given input data, but no accompanying labels to train in a classical supervised way.\n", "However, this data still contains a lot of information from which we can learn: how are the images different from each other?\n", "What patterns are descriptive for certain images?\n", "Can we cluster the images?\n", "To get an insight into these questions, we will implement a popular, simple contrastive learning method, SimCLR, and apply it to the STL10 dataset.\n", "This notebook is part of a lecture series on Deep Learning at the University of Amsterdam.\n", "The full list of tutorials can be found at https://uvadlc-notebooks.rtfd.io.\n", "\n", "\n", "---\n", "Open in [{height=\"20px\" width=\"117px\"}](https://colab.research.google.com/github/PytorchLightning/lightning-tutorials/blob/publication/.notebooks/course_UvA-DL/13-contrastive-learning.ipynb)\n", "\n", "Give us a \u2b50 [on Github](https://www.github.com/Lightning-AI/lightning/)\n", "| Check out [the documentation](https://lightning.ai/docs/)\n", "| Join us [on Discord](https://discord.com/invite/tfXFetEZxv)"]}, {"cell_type": "markdown", "id": "c3ab1d02", "metadata": {"papermill": {"duration": 0.015903, "end_time": "2025-04-08T10:53:38.318512", "exception": false, "start_time": "2025-04-08T10:53:38.302609", "status": "completed"}, "tags": []}, "source": ["## Setup\n", "This notebook requires some packages besides pytorch-lightning."]}, {"cell_type": "code", "execution_count": 1, "id": "ac79bbd2", "metadata": {"colab": {}, "colab_type": "code", "execution": {"iopub.execute_input": "2025-04-08T10:53:38.360594Z", "iopub.status.busy": "2025-04-08T10:53:38.355316Z", "iopub.status.idle": "2025-04-08T10:53:39.724071Z", "shell.execute_reply": "2025-04-08T10:53:39.723154Z"}, "id": "LfrJLKPFyhsK", "lines_to_next_cell": 0, "papermill": {"duration": 1.392574, "end_time": "2025-04-08T10:53:39.726004", "exception": false, "start_time": "2025-04-08T10:53:38.333430", "status": "completed"}, "tags": []}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["\u001b[33mWARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager, possibly rendering your system unusable.It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv. 
Use the --root-user-action option if you know what you are doing and want to suppress this warning.\u001b[0m\u001b[33m\r\n", "\u001b[0m"]}, {"name": "stdout", "output_type": "stream", "text": ["\r\n", "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m A new release of pip is available: \u001b[0m\u001b[31;49m24.2\u001b[0m\u001b[39;49m -> \u001b[0m\u001b[32;49m25.0.1\u001b[0m\r\n", "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m To update, run: \u001b[0m\u001b[32;49mpython -m pip install --upgrade pip\u001b[0m\r\n"]}], "source": ["! pip install --quiet \"torch >=1.8.1,<2.7\" \"torchmetrics >=1.0,<1.8\" \"matplotlib\" \"tensorboard\" \"seaborn\" \"pytorch-lightning >=2.0,<2.6\" \"numpy <3.0\" \"torchvision\""]}, {"cell_type": "markdown", "id": "4035cef8", "metadata": {"papermill": {"duration": 0.014805, "end_time": "2025-04-08T10:53:39.757210", "exception": false, "start_time": "2025-04-08T10:53:39.742405", "status": "completed"}, "tags": []}, "source": ["