{"cells": [{"cell_type": "markdown", "id": "55e4cf7d", "metadata": {"papermill": {"duration": 0.014707, "end_time": "2025-04-03T20:48:48.717449", "exception": false, "start_time": "2025-04-03T20:48:48.702742", "status": "completed"}, "tags": []}, "source": ["\n", "# Fine-Tuning Scheduler\n", "\n", "* **Author:** [Dan Dale](https://github.com/speediedan)\n", "* **License:** CC BY-SA\n", "* **Generated:** 2025-04-03T20:48:41.528429\n", "\n", "This notebook introduces the [Fine-Tuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension\n", "and demonstrates its use to fine-tune a small foundation model on the\n", "[RTE](https://huggingface.co/datasets/viewer/?dataset=super_glue&config=rte) task of\n", "[SuperGLUE](https://super.gluebenchmark.com/) with iterative early-stopping defined according to a user-specified\n", "schedule. It uses Hugging Face's ``datasets`` and ``transformers`` libraries to retrieve the relevant benchmark data\n", "and foundation model weights. 
The required dependencies are installed via the finetuning-scheduler ``[examples]`` extra.\n", "\n", "\n", "---\n", "Open in [Google Colab](https://colab.research.google.com/github/PytorchLightning/lightning-tutorials/blob/publication/.notebooks/lightning_examples/finetuning-scheduler.ipynb)\n", "\n", "Give us a \u2b50 [on GitHub](https://www.github.com/Lightning-AI/lightning/)\n", "| Check out [the documentation](https://lightning.ai/docs/)\n", "| Join us [on Discord](https://discord.com/invite/tfXFetEZxv)"]}, {"cell_type": "markdown", "id": "afe0d9b8", "metadata": {"papermill": {"duration": 0.012141, "end_time": "2025-04-03T20:48:48.742912", "exception": false, "start_time": "2025-04-03T20:48:48.730771", "status": "completed"}, "tags": []}, "source": ["## Setup\n", "This notebook requires some packages besides pytorch-lightning."]}, {"cell_type": "code", "execution_count": 1, "id": "be93e2d1", "metadata": {"colab": {}, "colab_type": "code", "execution": {"iopub.execute_input": "2025-04-03T20:48:48.768297Z", "iopub.status.busy": "2025-04-03T20:48:48.768120Z", "iopub.status.idle": "2025-04-03T20:48:50.443473Z", "shell.execute_reply": "2025-04-03T20:48:50.442383Z"}, "id": "LfrJLKPFyhsK", "lines_to_next_cell": 0, "papermill": {"duration": 1.69017, "end_time": "2025-04-03T20:48:50.445264", "exception": false, "start_time": "2025-04-03T20:48:48.755094", "status": "completed"}, "tags": []}, "outputs": [], "source": ["! pip install --quiet \"torch>=1.8.1, <2.7\" \"pytorch-lightning >=2.0,<2.6\" \"datasets >=2.17.0\" \"matplotlib\" \"finetuning-scheduler[examples] <=2.5.0\" \"torchmetrics>=1.0, <1.8\" \"numpy <3.0\""]}, {"cell_type": "markdown", "id": "1336f18c", "metadata": {"papermill": {"duration": 0.012341, "end_time": "2025-04-03T20:48:50.470569", "exception": false, "start_time": "2025-04-03T20:48:50.458228", "status": "completed"}, "tags": []}, "source": ["## Scheduled Fine-Tuning with the Fine-Tuning Scheduler Extension\n", "\n", "The [Fine-Tuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension accelerates and enhances model experimentation with flexible fine-tuning schedules.\n", "\n", "Training with the extension is simple and confers a host of benefits:\n", "\n", "- dramatically increases fine-tuning flexibility\n", "- expedites and facilitates exploration of model tuning dynamics\n", "- enables marginal performance improvements of fine-tuned models\n", "\n", "Setup is straightforward: just install from PyPI! 
Since this notebook-based example requires a few additional packages (e.g.\n", "``transformers``, ``sentencepiece``), we installed the ``finetuning-scheduler`` package with the ``[examples]`` extra above.\n", "Once the ``finetuning-scheduler`` package is installed, the [FinetuningScheduler](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts.html#finetuning_scheduler.fts.FinetuningScheduler) callback (FTS) is available for use with Lightning.\n", "For additional installation options, please see the Fine-Tuning Scheduler [README](https://github.com/speediedan/finetuning-scheduler/blob/main/README.md).\n", "\n", "\n", "\n", "