{"cells": [{"cell_type": "markdown", "id": "8bbf9e00", "metadata": {"papermill": {"duration": 0.020091, "end_time": "2025-05-01T10:49:12.776432", "exception": false, "start_time": "2025-05-01T10:49:12.756341", "status": "completed"}, "tags": []}, "source": ["\n", "# Tutorial 13: Self-Supervised Contrastive Learning with SimCLR\n", "\n", "* **Author:** Phillip Lippe\n", "* **License:** CC BY-SA\n", "* **Generated:** 2025-05-01T10:49:06.093194\n", "\n", "In this tutorial, we will take a closer look at self-supervised contrastive learning.\n", "Self-supervised learning, sometimes also called unsupervised learning, describes the scenario where we are given input data but no accompanying labels for training in the classical supervised way.\n", "However, this data still contains a lot of information from which we can learn: how do the images differ from each other?\n", "Which patterns are descriptive of certain images?\n", "Can we cluster the images?\n", "To gain insight into these questions, we will implement a popular and simple contrastive learning method, SimCLR, and apply it to the STL10 dataset.\n", "This notebook is part of a lecture series on Deep Learning at the University of Amsterdam.\n", "The full list of tutorials can be found at https://uvadlc-notebooks.rtfd.io.\n", "\n", "\n", "---\n", "Open this notebook [in Google Colab](https://colab.research.google.com/github/PytorchLightning/lightning-tutorials/blob/publication/.notebooks/course_UvA-DL/13-contrastive-learning.ipynb)\n", "Give us a \u2b50 [on GitHub](https://www.github.com/Lightning-AI/lightning/)\n", "| Check out [the documentation](https://lightning.ai/docs/)\n", "| Join us [on Discord](https://discord.com/invite/tfXFetEZxv)"]}, {"cell_type": "markdown", "id": "01bf38ce", "metadata": {"papermill": {"duration": 0.01942, "end_time": "2025-05-01T10:49:12.814069", "exception": false, "start_time": "2025-05-01T10:49:12.794649", "status": "completed"}, "tags": []}, "source": ["## Setup\n", 
"This notebook requires some packages besides pytorch-lightning."]}, {"cell_type": "code", "execution_count": 1, "id": "b4fbbbef", "metadata": {"colab": {}, "colab_type": "code", "execution": {"iopub.execute_input": "2025-05-01T10:49:12.845254Z", "iopub.status.busy": "2025-05-01T10:49:12.844891Z", "iopub.status.idle": "2025-05-01T10:49:14.040285Z", "shell.execute_reply": "2025-05-01T10:49:14.038918Z"}, "id": "LfrJLKPFyhsK", "lines_to_next_cell": 0, "papermill": {"duration": 1.212848, "end_time": "2025-05-01T10:49:14.042942", "exception": false, "start_time": "2025-05-01T10:49:12.830094", "status": "completed"}, "tags": []}, "outputs": [], "source": ["! 
pip install --quiet \"numpy <3.0\" \"torch >=1.8.1,<2.8\" \"torchvision\" \"seaborn\" \"matplotlib\" \"torchmetrics >=1.0,<1.8\" \"pytorch-lightning >=2.0,<2.6\" \"tensorboard\""]}, {"cell_type": "markdown", "id": "881dc613", "metadata": {"papermill": {"duration": 0.026635, "end_time": "2025-05-01T10:49:14.100276", "exception": false, "start_time": "2025-05-01T10:49:14.073641", "status": "completed"}, "tags": []}, "source": ["