diff --git a/config_auth_cds.ipynb b/config_auth_cds.ipynb
index 033b05c..38e3c80 100644
--- a/config_auth_cds.ipynb
+++ b/config_auth_cds.ipynb
@@ -4,7 +4,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "### d4science_copernicus_cds Library Setup and Example\n",
+ "### d4science_copernicus_cds Library - Setup and Example\n",
"\n",
"This Jupyter notebook will guide you through setting up all dependencies and configuring the environment to use the `d4science_copernicus_cds` library. It also provides a comprehensive example of the library's features and capabilities, helping you to manage Climate Data Store (CDS) API authentication and make programmatic requests from the CDS.\n",
"\n",
@@ -14,8 +14,8 @@
"\n",
"To begin, you’ll need your CDS API credentials. Follow these steps to obtain them:\n",
"\n",
- "1. Register or log in to the CDS at [https://cds-beta.climate.copernicus.eu](https://cds-beta.climate.copernicus.eu).\n",
- "2. Visit [https://cds-beta.climate.copernicus.eu/how-to-api](https://cds-beta.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
+ "1. Register or log in to the CDS at [https://cds.climate.copernicus.eu](https://cds.climate.copernicus.eu).\n",
+ "2. Visit [https://cds.climate.copernicus.eu/how-to-api](https://cds-beta.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
"\n",
"The library will prompt you to enter:\n",
"- **URL**: The URL field is prefilled; simply press Enter to accept the default.\n",
@@ -62,6 +62,20 @@
"!pip install numpy matplotlib cartopy xarray netCDF4 cdsapi"
]
},
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "!pip install \"numpy>=1.16.5,<1.23.0\"\n",
+ "!pip install pandas xarray \n",
+ "!pip install cdsapi\n",
+ "!pip install matplotlib cartopy\n",
+ "!pip install cfgrib\n",
+ "!pip install xskillscore"
+ ]
+ },
{
"cell_type": "code",
"execution_count": null,
@@ -80,27 +94,6 @@
"!pip install zarr dask fsspec"
]
},
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "!pip install numpy pandas xarray xskillscore\n",
- "!pip install cdsapi\n",
- "!pip install matplotlib cartopy\n",
- "!pip install cfgrib"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "!pip install cfgrib ecCodes"
- ]
- },
{
"cell_type": "markdown",
"metadata": {},
diff --git a/tutorials/01_reanalysis/01x01_reanalysis-climatology.ipynb b/tutorials/01_reanalysis/01x01_reanalysis-climatology.ipynb
index 0f22faf..6a1713d 100644
--- a/tutorials/01_reanalysis/01x01_reanalysis-climatology.ipynb
+++ b/tutorials/01_reanalysis/01x01_reanalysis-climatology.ipynb
@@ -33,8 +33,8 @@
"The library prompts us to enter our credentials, which are then securely saved in our workspace. **This request is only made the first time**; afterward, the `get_credentials` function will automatically retrieve the credentials from the environment or workspace, eliminating the need to re-enter them in the Jupyter notebook.\n",
"\n",
"To obtain your API credentials:\n",
- "1. Register or log in to the CDS at [https://cds-beta.climate.copernicus.eu](https://cds-beta.climate.copernicus.eu).\n",
- "2. Visit [https://cds-beta.climate.copernicus.eu/how-to-api](https://cds-beta.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
+ "1. Register or log in to the CDS at [https://cds.climate.copernicus.eu](https://cds-beta.climate.copernicus.eu).\n",
+ "2. Visit [https://cds.climate.copernicus.eu/how-to-api](https://cds-beta.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
"\n",
"The library will prompt you to enter:\n",
"- **URL**: The URL field is prefilled; simply press Enter to accept the default.\n",
diff --git a/tutorials/01_reanalysis/01x02_reanalysis-temp-record.ipynb b/tutorials/01_reanalysis/01x02_reanalysis-temp-record.ipynb
index 3e9673a..95677f0 100644
--- a/tutorials/01_reanalysis/01x02_reanalysis-temp-record.ipynb
+++ b/tutorials/01_reanalysis/01x02_reanalysis-temp-record.ipynb
@@ -2,7 +2,7 @@
"cells": [
{
"cell_type": "markdown",
- "id": "8130d058",
+ "id": "13ecb88a",
"metadata": {},
"source": [
"# Tutorial on July 2023 record-breaking global surface temperatures using climate data from C3S"
@@ -10,7 +10,7 @@
},
{
"cell_type": "markdown",
- "id": "65b72b64",
+ "id": "d3cbbf9e",
"metadata": {},
"source": [
"### About\n",
@@ -31,7 +31,7 @@
},
{
"cell_type": "markdown",
- "id": "771a66cc",
+ "id": "b532b47d",
"metadata": {},
"source": [
"### d4science_copernicus_cds Library\n",
@@ -41,8 +41,8 @@
"The library prompts us to enter our credentials, which are then securely saved in our workspace. **This request is only made the first time**; afterward, the `get_credentials` function will automatically retrieve the credentials from the environment or workspace, eliminating the need to re-enter them in the Jupyter notebook.\n",
"\n",
"To obtain your API credentials:\n",
- "1. Register or log in to the CDS at [https://cds-beta.climate.copernicus.eu](https://cds-beta.climate.copernicus.eu).\n",
- "2. Visit [https://cds-beta.climate.copernicus.eu/how-to-api](https://cds-beta.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
+ "1. Register or log in to the CDS at [https://cds.climate.copernicus.eu](https://cds.climate.copernicus.eu).\n",
+ "2. Visit [https://cds.climate.copernicus.eu/how-to-api](https://cds.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
"\n",
"The library will prompt you to enter:\n",
"- **URL**: The URL field is prefilled; simply press Enter to accept the default.\n",
@@ -53,7 +53,7 @@
},
{
"cell_type": "markdown",
- "id": "fe7a0949",
+ "id": "96edd04b",
"metadata": {},
"source": [
"This tutorial is based on the official turorial **[CDS API guide](https://ecmwf-projects.github.io/copernicus-training-c3s/reanalysis-temp-record.html)**, extended and adapted for use in the **BlueCloud JupyterLab** environment."
@@ -62,7 +62,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "f07ec047",
+ "id": "de20e7af",
"metadata": {},
"outputs": [],
"source": [
@@ -72,7 +72,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "e10262fc",
+ "id": "d1c7ec56",
"metadata": {},
"outputs": [],
"source": [
@@ -82,7 +82,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "4bd3fb71",
+ "id": "50290c4a",
"metadata": {},
"outputs": [],
"source": [
@@ -97,7 +97,7 @@
},
{
"cell_type": "markdown",
- "id": "692a7b7c",
+ "id": "ae061938",
"metadata": {},
"source": [
"cds_datadir will create a folder in our workspace, under cds_dataDir, with current timestamp and custom label"
@@ -106,7 +106,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "4e3a9659",
+ "id": "1b982ef2",
"metadata": {},
"outputs": [],
"source": [
@@ -116,7 +116,7 @@
},
{
"cell_type": "markdown",
- "id": "07fcf992",
+ "id": "91801237",
"metadata": {},
"source": [
"## 1. Search, download and view data"
@@ -124,7 +124,7 @@
},
{
"cell_type": "markdown",
- "id": "54ad2fba",
+ "id": "156c3253",
"metadata": {},
"source": [
"Before we begin we must prepare our environment. This includes installing the Application Programming Interface (API) of the CDS as well as other required libs, and importing the various python libraries that we will need."
@@ -132,7 +132,7 @@
},
{
"cell_type": "markdown",
- "id": "fc93afc0",
+ "id": "d50ba2bb",
"metadata": {},
"source": [
"#### Infrastructure introduction (installing API of the CDS)"
@@ -140,7 +140,7 @@
},
{
"cell_type": "markdown",
- "id": "d8db144f",
+ "id": "77b0e14d",
"metadata": {},
"source": [
"In this exercise we will mainly use `cdsapi`, `xarray`, `matplotlib` and `cartopy` python libraries."
@@ -148,7 +148,7 @@
},
{
"cell_type": "markdown",
- "id": "5638a7c8",
+ "id": "37359183",
"metadata": {},
"source": [
"There are several options to run the code in this tutorial:\n",
@@ -158,7 +158,7 @@
},
{
"cell_type": "markdown",
- "id": "2a0767bd",
+ "id": "07dbb0ba",
"metadata": {},
"source": [
"#### Installation on your computer"
@@ -166,7 +166,7 @@
},
{
"cell_type": "markdown",
- "id": "91ee485f",
+ "id": "ed7fb023",
"metadata": {},
"source": [
"First of all, in order to run this notebook on your computer you need to install Python and the required libs."
@@ -174,7 +174,7 @@
},
{
"cell_type": "markdown",
- "id": "2875c115",
+ "id": "7d330dd7",
"metadata": {},
"source": [
"The easiest way to install Python without interfering with other potential Python installations on your system is by using [Miniconda, Miniforge or Mambaforge](https://github.com/conda-forge/miniforge/blob/main/README.md). This will install a modern Python for your user and the **Conda**/**Mamba** package manager. **Mamba** is a performant drop-in replacement for **Conda**."
@@ -182,7 +182,7 @@
},
{
"cell_type": "markdown",
- "id": "aeb3656c",
+ "id": "19dc01a7",
"metadata": {},
"source": [
"Once Python + **Conda**/**Mamba** are installed run the following from the command line to install the API of the CDS, `cdsapi`, and the rest of the requirements:\n",
@@ -198,7 +198,7 @@
},
{
"cell_type": "markdown",
- "id": "4a2a46d7",
+ "id": "d233a39d",
"metadata": {},
"source": [
"If everything is installed correctly run the following from the command line:\n",
@@ -212,7 +212,7 @@
},
{
"cell_type": "markdown",
- "id": "eb3c1071",
+ "id": "72c853f1",
"metadata": {},
"source": [
"#### Running on Colab or Kaggle"
@@ -220,7 +220,7 @@
},
{
"cell_type": "markdown",
- "id": "c6fe3496",
+ "id": "3c27152f",
"metadata": {},
"source": [
"If you are on Colab or Kaggle just run the following line of code to install the API of the CDS and the rest of the dependencies before running the rest of the code:"
@@ -229,7 +229,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "73fdf414",
+ "id": "6aedac61",
"metadata": {},
"outputs": [],
"source": [
@@ -238,7 +238,7 @@
},
{
"cell_type": "markdown",
- "id": "779f51f5",
+ "id": "c4791e4d",
"metadata": {},
"source": [
"#### Import libraries"
@@ -246,7 +246,7 @@
},
{
"cell_type": "markdown",
- "id": "be67dade",
+ "id": "88175395",
"metadata": {},
"source": [
"We will start importing the required libraries. These libs should be already installed. If you have not installed the requirements, please go to the specific section above."
@@ -255,7 +255,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "c144aa3f",
+ "id": "1d2427b2",
"metadata": {},
"outputs": [],
"source": [
@@ -273,18 +273,18 @@
},
{
"cell_type": "markdown",
- "id": "afd220dd",
+ "id": "4898e76d",
"metadata": {},
"source": [
"#### Search for data\n",
"\n",
- "To search for data, visit the CDS website: https://cds-beta.climate.copernicus.eu.\n",
- "Here you can search for ERA5 data using the search bar. The data we need for this tutorial is the [ERA5 monthly averaged data on single levels from 1940 to present](https://cds-beta.climate.copernicus.eu/datasets/reanalysis-era5-single-levels-monthly-means?tab=overview). ERA5 is the 5th version of the ECMWF Reanalysis dataset. Reanalysis uses a state of the art forecast model and data assimilation system to create a consistent \"map without gaps\" of observed and modelled climate variables over the past decades."
+ "To search for data, visit the CDS website: https://cds.climate.copernicus.eu.\n",
+ "Here you can search for ERA5 data using the search bar. The data we need for this tutorial is the [ERA5 monthly averaged data on single levels from 1940 to present](https://cds.climate.copernicus.eu/datasets/reanalysis-era5-single-levels-monthly-means?tab=overview). ERA5 is the 5th version of the ECMWF Reanalysis dataset. Reanalysis uses a state of the art forecast model and data assimilation system to create a consistent \"map without gaps\" of observed and modelled climate variables over the past decades."
]
},
{
"cell_type": "markdown",
- "id": "939b2301",
+ "id": "f0e76cbd",
"metadata": {},
"source": [
"Having selected the correct dataset, we now need to specify what product type, variables, temporal and geographic coverage we are interested in. These can all be selected in the **\"Download data\"** tab. In this tab a form appears in which we will select the following parameters to download:\n",
@@ -302,7 +302,7 @@
},
{
"cell_type": "markdown",
- "id": "6e755a8b",
+ "id": "d1d8042c",
"metadata": {},
"source": [
"
"
@@ -310,7 +310,7 @@
},
{
"cell_type": "markdown",
- "id": "f9755801",
+ "id": "1925d01d",
"metadata": {},
"source": [
"#### Download data\n",
@@ -321,7 +321,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "9d8818a8",
+ "id": "1357d07b",
"metadata": {},
"outputs": [],
"source": [
@@ -372,7 +372,7 @@
},
{
"cell_type": "markdown",
- "id": "7ed10fdc",
+ "id": "398f92b7",
"metadata": {},
"source": [
"#### Inspect data\n",
@@ -383,7 +383,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "8938c920",
+ "id": "a3b8a02d",
"metadata": {},
"outputs": [],
"source": [
@@ -392,7 +392,7 @@
},
{
"cell_type": "markdown",
- "id": "b2f8d3c9",
+ "id": "e74ef8d7",
"metadata": {},
"source": [
"Now we can query our newly created Xarray dataset... Let's have a look at the `ds`."
@@ -401,7 +401,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "e32a3886",
+ "id": "949c4e42",
"metadata": {},
"outputs": [],
"source": [
@@ -410,7 +410,7 @@
},
{
"cell_type": "markdown",
- "id": "7c42c6e1",
+ "id": "c70224de",
"metadata": {},
"source": [
"We see that the dataset has one variable called `t2m`, which stands for \"2 metre temperature\", and three coordinates of `longitude`, `latitude` and `time`. \n",
@@ -428,7 +428,7 @@
},
{
"cell_type": "markdown",
- "id": "a99342c1",
+ "id": "2245a08b",
"metadata": {},
"source": [
"There is also an `expver` coordinate. More on this later."
@@ -436,7 +436,7 @@
},
{
"cell_type": "markdown",
- "id": "b1978d15",
+ "id": "506abc02",
"metadata": {},
"source": [
"Select the icons to the right of the table above to expand the attributes of the coordinates and data variables. What are the units of the temperature data?"
@@ -444,7 +444,7 @@
},
{
"cell_type": "markdown",
- "id": "3dc62489",
+ "id": "c768b976",
"metadata": {},
"source": [
"While an Xarray dataset may contain multiple variables, an Xarray data array holds a single multi-dimensional variable and its coordinates. To make the processing of the `t2m` data easier, we convert it into an Xarray data array. We will call it `da_tmp` (a temporary data array) because we will transform the data in some ways."
@@ -453,7 +453,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "da8db3c5",
+ "id": "b4463f87",
"metadata": {},
"outputs": [],
"source": [
@@ -462,7 +462,7 @@
},
{
"cell_type": "markdown",
- "id": "d2a03104",
+ "id": "09b4609f",
"metadata": {},
"source": [
"Let's view this data:"
@@ -471,7 +471,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "e9e1cc50",
+ "id": "c4fe17f8",
"metadata": {},
"outputs": [],
"source": [
@@ -480,7 +480,7 @@
},
{
"cell_type": "markdown",
- "id": "17c621f8",
+ "id": "2cde7606",
"metadata": {},
"source": [
"From the result of the cell above you can see that now we have a `xarray.DataArray`."
@@ -488,7 +488,7 @@
},
{
"cell_type": "markdown",
- "id": "9f34f5d4",
+ "id": "65ae8421",
"metadata": {},
"source": [
"#### Merge the two ERA5 experiments (1 and 5, `expver = [1,5]`)\n",
@@ -501,7 +501,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "cd5eb8a2",
+ "id": "a6c23e39",
"metadata": {},
"outputs": [],
"source": [
@@ -511,7 +511,7 @@
},
{
"cell_type": "markdown",
- "id": "3b95cc14",
+ "id": "2832e39b",
"metadata": {},
"source": [
"Let's check again the `da_tmp` data array. If there was an `expver` coordinate we [reduce this dimension](https://docs.xarray.dev/en/stable/generated/xarray.DataArray.reduce.html) by performing a [`nansum`](https://numpy.org/doc/stable/reference/generated/numpy.nansum.html) operation, i.e. a sum of the array elements over this axis, treating Not a Numbers (NaNs) as zero. The result is a new `xarray.DataArray` merging the data along the `expver` dimension:"
@@ -520,7 +520,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "9109ec75",
+ "id": "b9dc47a7",
"metadata": {},
"outputs": [],
"source": [
@@ -529,7 +529,7 @@
},
{
"cell_type": "markdown",
- "id": "1ee284b5",
+ "id": "68f1b885",
"metadata": {},
"source": [
"Now the data array contains the three expected dimensions: `time`, `latitude` and `longitude`."
@@ -537,7 +537,7 @@
},
{
"cell_type": "markdown",
- "id": "53ad44a8",
+ "id": "2b1a9717",
"metadata": {},
"source": [
"#### Change temperature units from Kelvin to Celsius\n",
@@ -548,7 +548,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "8d8e1cbb",
+ "id": "cc04e236",
"metadata": {},
"outputs": [],
"source": [
@@ -561,7 +561,7 @@
},
{
"cell_type": "markdown",
- "id": "3fedd3fb",
+ "id": "ca0cd56f",
"metadata": {},
"source": [
"#### Data to be used"
@@ -569,7 +569,7 @@
},
{
"cell_type": "markdown",
- "id": "36645c48",
+ "id": "399ab10b",
"metadata": {},
"source": [
"The `da_celsius` data array will be used in the rest of the surface temperature exercise. Let's check what we have:"
@@ -578,7 +578,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "c519a979",
+ "id": "1c40ce3a",
"metadata": {},
"outputs": [],
"source": [
@@ -587,7 +587,7 @@
},
{
"cell_type": "markdown",
- "id": "12436699",
+ "id": "1e5eef03",
"metadata": {},
"source": [
"Now we can see the updated values in *Celsius* and the `units` attribute updated accordingly."
@@ -595,7 +595,7 @@
},
{
"cell_type": "markdown",
- "id": "ba44a1f5",
+ "id": "b1e3e5ac",
"metadata": {},
"source": [
"#### Plotting one timestep"
@@ -603,7 +603,7 @@
},
{
"cell_type": "markdown",
- "id": "b582dd41",
+ "id": "2c6f7104",
"metadata": {},
"source": [
"Just to check what we have so far, let's plot a map of 2m temperature for the first (July 1940) and the last (July 2023) timesteps. We will plot these maps using the convenience method `plot` available for `xarray.DataArray`. This allows the creation of simple plots using one line of code. Also, with the xarray method `sel()`, you can select a data array based on coordinate labels."
@@ -612,7 +612,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "87864f37",
+ "id": "3a4f7ebf",
"metadata": {},
"outputs": [],
"source": [
@@ -622,7 +622,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "c5946b4b",
+ "id": "cef44f45",
"metadata": {},
"outputs": [],
"source": [
@@ -631,7 +631,7 @@
},
{
"cell_type": "markdown",
- "id": "f31fc1ef",
+ "id": "0e0bfb0e",
"metadata": {},
"source": [
"## 2. Calculate a surface temperature climatology: reference period 1991-2020"
@@ -639,7 +639,7 @@
},
{
"cell_type": "markdown",
- "id": "a3cbaa85",
+ "id": "f05884d3",
"metadata": {},
"source": [
"#### Standard reference periods and climatologies\n",
@@ -654,7 +654,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "15e2eb25",
+ "id": "47c7fcba",
"metadata": {},
"outputs": [],
"source": [
@@ -663,7 +663,7 @@
},
{
"cell_type": "markdown",
- "id": "220d67bd",
+ "id": "c629b35b",
"metadata": {},
"source": [
"If we have a look at this data object we will see now we have only two coordinates, `latitude` and `longitude`."
@@ -672,7 +672,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "7e313e58",
+ "id": "afa9a7e5",
"metadata": {},
"outputs": [],
"source": [
@@ -681,7 +681,7 @@
},
{
"cell_type": "markdown",
- "id": "aec6b340",
+ "id": "bf6e191f",
"metadata": {},
"source": [
"We can also make a quick plot to have an exploratory view of this new `xarray.DataArray`:"
@@ -690,7 +690,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "900462ab",
+ "id": "c0f07ad9",
"metadata": {},
"outputs": [],
"source": [
@@ -699,7 +699,7 @@
},
{
"cell_type": "markdown",
- "id": "9d500623",
+ "id": "7307dfa4",
"metadata": {},
"source": [
"## 3. Visualise surface temperature anomalies\n",
@@ -712,7 +712,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "769fb056",
+ "id": "0f556907",
"metadata": {},
"outputs": [],
"source": [
@@ -722,7 +722,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "bc174576",
+ "id": "cca9d238",
"metadata": {},
"outputs": [],
"source": [
@@ -732,7 +732,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "6d34a81f",
+ "id": "2b329975",
"metadata": {},
"outputs": [],
"source": [
@@ -742,7 +742,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "0f226683",
+ "id": "a02b9462",
"metadata": {},
"outputs": [],
"source": [
@@ -751,7 +751,7 @@
},
{
"cell_type": "markdown",
- "id": "cd992193",
+ "id": "88731e92",
"metadata": {},
"source": [
"The anomaly will be the difference between `t2m_july2023` and `t2m_ref_per`. A positive value means July 2023 is above the expected mean:"
@@ -760,7 +760,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "86868d79",
+ "id": "459d1ec8",
"metadata": {},
"outputs": [],
"source": [
@@ -769,7 +769,7 @@
},
{
"cell_type": "markdown",
- "id": "0e57b035",
+ "id": "db98cfe8",
"metadata": {},
"source": [
"The previous operation results in the anomaly on each longitude and latitude location stored in the `anom` data array. We can plot this in a map to check where the anomaly was positive (July 2023 warmer than the climatology) or negative (July 2023 colder than the climatology). This time we will create the plot using the `matplotlib` and `cartopy` libraries.\n",
@@ -780,7 +780,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "61863839",
+ "id": "72bdf393",
"metadata": {},
"outputs": [],
"source": [
@@ -819,7 +819,7 @@
},
{
"cell_type": "markdown",
- "id": "3b3213c7",
+ "id": "5476cc38",
"metadata": {},
"source": [
"## 4. View time series and analyse surface temperature trends"
@@ -827,7 +827,7 @@
},
{
"cell_type": "markdown",
- "id": "b64c9f82",
+ "id": "a8499126",
"metadata": {},
"source": [
"Now let us view the time series from 1940 to 2023 averaged over the entire region. To do this we need to average `da_celsius` over the latitude and longitude dimensions. A very important consideration however is that the gridded data cells do not all correspond to the same areas. The size covered by each data point on the model grid varies as a function of latitude. We need to take this into account when calculating spatial averages. \n",
@@ -844,7 +844,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "56a2b4d1",
+ "id": "27abda31",
"metadata": {},
"outputs": [],
"source": [
@@ -855,7 +855,7 @@
},
{
"cell_type": "markdown",
- "id": "a8cdb891",
+ "id": "c811535d",
"metadata": {},
"source": [
"Then we calculate the weighted mean so we will have a time series with the spatially averaged July `t2m` from 1940 to 2023."
@@ -864,7 +864,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "ed7197ee",
+ "id": "7f7121e6",
"metadata": {},
"outputs": [],
"source": [
@@ -873,7 +873,7 @@
},
{
"cell_type": "markdown",
- "id": "448b1d59",
+ "id": "9ace0124",
"metadata": {},
"source": [
"Let's look at the new data array:"
@@ -882,7 +882,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "f2cd6755",
+ "id": "f8c1672d",
"metadata": {},
"outputs": [],
"source": [
@@ -891,7 +891,7 @@
},
{
"cell_type": "markdown",
- "id": "ae4315b8",
+ "id": "0fbf808c",
"metadata": {},
"source": [
"We will calculate the climatology for this global spatially averaged July `t2m`. This value will be used later to check which years have global average 2m temperature above or below the climatology."
@@ -900,7 +900,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "12551a1d",
+ "id": "a3d40c2c",
"metadata": {},
"outputs": [],
"source": [
@@ -910,7 +910,7 @@
},
{
"cell_type": "markdown",
- "id": "d2ce5442",
+ "id": "852bc241",
"metadata": {},
"source": [
"We will create a constant array with the climatology value that has the same length as the time series:"
@@ -919,7 +919,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "10a9173c",
+ "id": "5f2e307f",
"metadata": {},
"outputs": [],
"source": [
@@ -928,7 +928,7 @@
},
{
"cell_type": "markdown",
- "id": "7b263f44",
+ "id": "cad87daa",
"metadata": {},
"source": [
"Let's plot the mean value since 1940. The values below the climatology will be highlighted in light blue while the values above the climatology will be highlighted in red. Code is commented in the code cell below."
@@ -937,7 +937,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "ef498af4",
+ "id": "faf43f6a",
"metadata": {},
"outputs": [],
"source": [
@@ -950,7 +950,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "b879e5a4",
+ "id": "6480a7d9",
"metadata": {},
"outputs": [],
"source": [
@@ -997,7 +997,7 @@
},
{
"cell_type": "markdown",
- "id": "f584f57c",
+ "id": "b91a3456",
"metadata": {},
"source": [
"Could you try a similar figure but using the anomalies (*\"monthly value\" - \"1991-2020 climatological value\"*) instead of the spatially aggregated average monthly values?"
@@ -1006,7 +1006,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "af9f8782",
+ "id": "22ee9da5",
"metadata": {},
"outputs": [],
"source": [
@@ -1015,7 +1015,7 @@
},
{
"cell_type": "markdown",
- "id": "5b77860c",
+ "id": "ca6ffaef",
"metadata": {},
"source": [
"Now let's order the months from colder to warmer."
@@ -1024,7 +1024,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "819e5613",
+ "id": "a457c675",
"metadata": {},
"outputs": [],
"source": [
@@ -1033,7 +1033,7 @@
},
{
"cell_type": "markdown",
- "id": "dc997bf8",
+ "id": "da7bd289",
"metadata": {},
"source": [
"Let's have a look to the result and check if it is sorted."
@@ -1042,7 +1042,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "97493e3c",
+ "id": "518c2059",
"metadata": {},
"outputs": [],
"source": [
@@ -1051,7 +1051,7 @@
},
{
"cell_type": "markdown",
- "id": "ce41cbed",
+ "id": "f21ad29c",
"metadata": {},
"source": [
"If we plot the ranking from colder to warmer including also the climate normal we'll see the following. As before, code is commented in the code cell below:"
@@ -1060,7 +1060,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "a1fe678e",
+ "id": "c183b63f",
"metadata": {},
"outputs": [],
"source": [
@@ -1108,7 +1108,7 @@
},
{
"cell_type": "markdown",
- "id": "ee592192",
+ "id": "bd007d7f",
"metadata": {},
"source": [
"## 5. View time series and analyse North Atlantic sea surface temperature trends"
@@ -1116,12 +1116,12 @@
},
{
"cell_type": "markdown",
- "id": "0329dc4f",
+ "id": "27bd926a",
"metadata": {},
"source": [
"#### This is a new exercise. In this part of the tutorial we will be working with monthly sea surface temperature (SST) data.\n",
"\n",
- "First we need to download a new dataset. As before, we need to specify what product type, variables, temporal and geographic coverage we are interested in. These can all be selected in the **\"Download data\"** tab in the CDS ([https://cds-beta.climate.copernicus.eu](https://cds-beta.climate.copernicus.eu)). In this tab a form appears in which we select the following parameters to download:\n",
+ "First we need to download a new dataset. As before, we need to specify what product type, variables, temporal and geographic coverage we are interested in. These can all be selected in the **\"Download data\"** tab in the CDS ([https://cds.climate.copernicus.eu](https://cds.climate.copernicus.eu)). In this tab a form appears in which we select the following parameters to download:\n",
"\n",
"- Product type: `Monthly averaged reanalysis`\n",
"- Variable: `sea_surface_temperature`\n",
@@ -1137,7 +1137,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "fed64d52",
+ "id": "b68f0700",
"metadata": {},
"outputs": [],
"source": [
@@ -1178,7 +1178,7 @@
},
{
"cell_type": "markdown",
- "id": "6ab7e416",
+ "id": "caee438f",
"metadata": {},
"source": [
"Let's do some work with this new dataset. First of all, let's read it."
@@ -1187,7 +1187,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "992d6184",
+ "id": "18a8aecb",
"metadata": {},
"outputs": [],
"source": [
@@ -1196,7 +1196,7 @@
},
{
"cell_type": "markdown",
- "id": "a08d8b8d",
+ "id": "c5d50c8e",
"metadata": {},
"source": [
"Now we can have a look at the dataset:"
@@ -1205,7 +1205,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "014eeeb6",
+ "id": "7ad7c054",
"metadata": {},
"outputs": [],
"source": [
@@ -1214,7 +1214,7 @@
},
{
"cell_type": "markdown",
- "id": "b0ab7a75",
+ "id": "e8455c78",
"metadata": {},
"source": [
"As before, we see there are four dimensions and units are in *Kelvin*. We will work with data in *degrees Celsius* and we will reduce the `expver` dimension as before:"
@@ -1223,7 +1223,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "bdaa0b17",
+ "id": "3e6bd908",
"metadata": {},
"outputs": [],
"source": [
@@ -1236,7 +1236,7 @@
},
{
"cell_type": "markdown",
- "id": "3b91a7ab",
+ "id": "4a5eca18",
"metadata": {},
"source": [
"We can have a quick look at the data using the convenient `plot` method:"
@@ -1245,7 +1245,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "1b0082ff",
+ "id": "615e3724",
"metadata": {},
"outputs": [],
"source": [
@@ -1254,7 +1254,7 @@
},
{
"cell_type": "markdown",
- "id": "f1cee5e7",
+ "id": "1a40f9ed",
"metadata": {},
"source": [
"In the plot above we can see many values are below 0, those located on land. Actually, in the original `sst_ds` `xarray.Dataset` the land positions had a value of `numpy.nan`. Now, for `sst_expver` this is not true. This is a result of the previous operation using `numpy.nansum` and subtracting `273.15`. After this operation the land locations have a value of `-273.15` which is not valid. Let's amend this using a mask:"
@@ -1263,7 +1263,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "c781e076",
+ "id": "cfa4c756",
"metadata": {},
"outputs": [],
"source": [
@@ -1274,7 +1274,7 @@
},
{
"cell_type": "markdown",
- "id": "0447a66f",
+ "id": "6f36219d",
"metadata": {},
"source": [
"Again, as before, we weight the dataset by the area:"
@@ -1283,7 +1283,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "7c6d857c",
+ "id": "f8786d88",
"metadata": {},
"outputs": [],
"source": [
@@ -1295,7 +1295,7 @@
},
{
"cell_type": "markdown",
- "id": "cb0a62dc",
+ "id": "e2a0fb50",
"metadata": {},
"source": [
"And, also, we calculate the spatially averaged value for each month to get a monthly time series of the average temperature of the sst over the main area of the North Atlantic from January 1991 to July 2023:"
@@ -1304,7 +1304,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "88de69f6",
+ "id": "7856c2bb",
"metadata": {},
"outputs": [],
"source": [
@@ -1315,7 +1315,7 @@
},
{
"cell_type": "markdown",
- "id": "7b54ae7e",
+ "id": "0acd402b",
"metadata": {},
"source": [
"In the plot above we can see the monthly evolution since 1991.\n",
@@ -1326,7 +1326,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "ab4bffcb",
+ "id": "74ecdeef",
"metadata": {},
"outputs": [],
"source": [
@@ -1348,7 +1348,7 @@
},
{
"cell_type": "markdown",
- "id": "8e29cd28",
+ "id": "b67a5f90",
"metadata": {},
"source": [
"And once we have this we can compare how recent SST values compare with those of previous years and to the climatology.\n",
@@ -1364,7 +1364,7 @@
{
"cell_type": "code",
"execution_count": null,
- "id": "5e6cb3b4",
+ "id": "ecb03dd0",
"metadata": {},
"outputs": [],
"source": [
@@ -1398,7 +1398,7 @@
},
{
"cell_type": "markdown",
- "id": "5d7efd10",
+ "id": "36d850af",
"metadata": {},
"source": [
"Notice the dramatic increase in SST over the North Atlantic in 2023 compared to previous years!"
@@ -1407,7 +1407,7 @@
],
"metadata": {
"kernelspec": {
- "display_name": "venv",
+ "display_name": "Python 3",
"language": "python",
"name": "python3"
},
@@ -1421,7 +1421,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.13.0"
+ "version": "3.8.5"
}
},
"nbformat": 4,
diff --git a/tutorials/01_reanalysis/01x03_reanalysis-heatwave.ipynb b/tutorials/01_reanalysis/01x03_reanalysis-heatwave.ipynb
index 6e1c529..8fbadca 100644
--- a/tutorials/01_reanalysis/01x03_reanalysis-heatwave.ipynb
+++ b/tutorials/01_reanalysis/01x03_reanalysis-heatwave.ipynb
@@ -33,8 +33,8 @@
"The library prompts us to enter our credentials, which are then securely saved in our workspace. **This request is only made the first time**; afterward, the `get_credentials` function will automatically retrieve the credentials from the environment or workspace, eliminating the need to re-enter them in the Jupyter notebook.\n",
"\n",
"To obtain your API credentials:\n",
- "1. Register or log in to the CDS at [https://cds-beta.climate.copernicus.eu](https://cds-beta.climate.copernicus.eu).\n",
- "2. Visit [https://cds-beta.climate.copernicus.eu/how-to-api](https://cds-beta.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
+ "1. Register or log in to the CDS at [https://cds.climate.copernicus.eu](https://cds.climate.copernicus.eu).\n",
+ "2. Visit [https://cds.climate.copernicus.eu/how-to-api](https://cds.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
"\n",
"The library will prompt you to enter:\n",
"- **URL**: The URL field is prefilled; simply press Enter to accept the default.\n",
@@ -171,8 +171,8 @@
"\n",
"~First, you have to define two variables: `URL` and `KEY` which build together your CDS API key.~\n",
"\n",
- "~The string of characters that make up your KEY include your personal User ID and CDS API key. To obtain these, first register or login to the CDS (https://cds-beta.climate.copernicus.eu), then visit https://cds-beta.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
- "~URL = 'https://cds-beta.climate.copernicus.eu/api'~\n",
+ "~The string of characters that make up your KEY include your personal User ID and CDS API key. To obtain these, first register or login to the CDS (https://cds.climate.copernicus.eu), then visit https://cds.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
+ "~URL = 'https://cds.climate.copernicus.eu/api'~\n",
"\n",
"~KEY = 'xxx'~\n",
"\n",
@@ -187,7 +187,7 @@
"source": [
"#### Search for data\n",
"\n",
- "To search for data, visit the CDS website: https://cds-beta.climate.copernicus.eu. To facilitate your search you can use keywords, or apply various filters. The data we are going to use in this exercise is the `ERA5 reanalysis data on single levels from 1979 to present`."
+ "To search for data, visit the CDS website: https://cds.climate.copernicus.eu. To facilitate your search you can use keywords, or apply various filters. The data we are going to use in this exercise is the `ERA5 reanalysis data on single levels from 1979 to present`."
]
},
{
@@ -694,7 +694,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "In the histogram above, you see that even if we take an increased sample covering a wider temporal range, the maximum daily temperature still never reached that of 15 September 2020. To increase the sample even further, you could include data from a longer time period. The C3S reanalysis dataset now extends back to 1940 and is accessible here [ERA5 hourly data on single levels from 1940 to present](https://cds-beta.climate.copernicus.eu/datasets?q=era5+hourly+single+levels)."
+ "In the histogram above, you see that even if we take an increased sample covering a wider temporal range, the maximum daily temperature still never reached that of 15 September 2020. To increase the sample even further, you could include data from a longer time period. The C3S reanalysis dataset now extends back to 1940 and is accessible here [ERA5 hourly data on single levels from 1940 to present](https://cds.climate.copernicus.eu/datasets?q=era5+hourly+single+levels)."
]
},
{
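The recurring substitution in this changeset is mechanical: the retired `cds-beta` hostname is replaced by the production CDS hostname everywhere it appears. A small sketch of that rewrite as a helper (the function name is ours, not part of any library):

```python
# Hostnames as they appear throughout this changeset.
OLD_HOST = "cds-beta.climate.copernicus.eu"
NEW_HOST = "cds.climate.copernicus.eu"

def migrate_cds_url(url: str) -> str:
    """Replace the retired beta hostname with the current CDS hostname.

    URLs that already use the new hostname pass through unchanged.
    """
    return url.replace(OLD_HOST, NEW_HOST)

print(migrate_cds_url("https://cds-beta.climate.copernicus.eu/api"))
# -> https://cds.climate.copernicus.eu/api
```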
diff --git a/tutorials/02_observation/02x01_erb-outgoing-longwave-radiation.ipynb b/tutorials/02_observation/02x01_erb-outgoing-longwave-radiation.ipynb
index 712d937..e9cb08e 100644
--- a/tutorials/02_observation/02x01_erb-outgoing-longwave-radiation.ipynb
+++ b/tutorials/02_observation/02x01_erb-outgoing-longwave-radiation.ipynb
@@ -14,7 +14,7 @@
"### About\n",
"\n",
"This notebook-tutorial provides a practical introduction to the HIRS dataset available on \n",
- "[C3S Earth's radiation budget from 1979 to present derived from satellite observations](https://cds-beta.climate.copernicus.eu/datasets/satellite-earth-radiation-budget?tab=overview). \n",
+ "[C3S Earth's radiation budget from 1979 to present derived from satellite observations](https://cds.climate.copernicus.eu/datasets/satellite-earth-radiation-budget?tab=overview). \n",
"
\n",
"We give a short introduction to the ECV Earth Radiation Budget, Outgoing Longwave Radiation (OLR) and provide three use cases of the dataset: plot the time-averaged global distribution of OLR (Use Case 1), calculate global timeseries of OLR (Use Case 2) and plot the Arctic weighted mean timeseries between 1979 and 2019 (Use Case 3).\n",
"We provide step-by-step instructions on data preparation. Use cases come with extensive documentation and each line of code is explained. \n",
@@ -34,8 +34,9 @@
"source": [
"### TUTORIA CODE FIX\n",
"\n",
- "the code of the official tutorial is not compatible with the new format\n",
- "this notebook fixes the download code and the logic to process it"
+ "The code of the official tutorial is not compatible with the new format of the data provided by the CDS API. \n",
+ "\n",
+ "This notebook fixes the download code and the logic to process it.\n"
]
},
{
@@ -49,8 +50,8 @@
"The library prompts us to enter our credentials, which are then securely saved in our workspace. **This request is only made the first time**; afterward, the `get_credentials` function will automatically retrieve the credentials from the environment or workspace, eliminating the need to re-enter them in the Jupyter notebook.\n",
"\n",
"To obtain your API credentials:\n",
- "1. Register or log in to the CDS at [https://cds-beta.climate.copernicus.eu](https://cds-beta.climate.copernicus.eu).\n",
- "2. Visit [https://cds-beta.climate.copernicus.eu/how-to-api](https://cds-beta.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
+ "1. Register or log in to the CDS at [https://cds.climate.copernicus.eu](https://cds.climate.copernicus.eu).\n",
+ "2. Visit [https://cds.climate.copernicus.eu/how-to-api](https://cds.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
"\n",
"The library will prompt you to enter:\n",
"- **URL**: The URL field is prefilled; simply press Enter to accept the default.\n",
@@ -149,7 +150,7 @@
"
\n",
"\n",
"Please find further information about the dataset as well as the data in the Climate Data Store catalogue entry Earth's Radiation Budget, sections \"Overview\", \"Download data\" and \"Documentation\": \n",
- "- [Earth's Radiation Budget from 1979 to present derived from satellite observations](https://cds-beta.climate.copernicus.eu/datasets/satellite-earth-radiation-budget?tab=overview)\n",
+ "- [Earth's Radiation Budget from 1979 to present derived from satellite observations](https://cds.climate.copernicus.eu/datasets/satellite-earth-radiation-budget?tab=overview)\n",
"\n",
"The tutorial video describes the \"Earth Radiation Budget\" Essential Climate Variable and the methods and satellite instruments used to produce the data provided in the CDS catalogue entry: \n",
"- [Tutorial video on the Earth Radiation Budget Essential Climate Variable](https://datastore.copernicus-climate.eu/documents/satellite-earth-radiation-budget/C3S_D312b_Lot1.4.2.5_201902_Tutorial_ECVEarthRadiationBudget_v1.4.mp4)"
@@ -276,9 +277,9 @@
"\n",
"~First, you have to define two variables: `URL` and `KEY` which build together your CDS API key.~\n",
"\n",
- "~The string of characters that make up your KEY include your personal User ID and CDS API key. To obtain these, first register or login to the CDS (https://cds-beta.climate.copernicus.eu), then visit https://cds-beta.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
+ "~The string of characters that make up your KEY include your personal User ID and CDS API key. To obtain these, first register or login to the CDS (https://cds.climate.copernicus.eu), then visit https://cds.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
"\n",
- "~URL = 'https://cds-beta.climate.copernicus.eu/api'~\n",
+ "~URL = 'https://cds.climate.copernicus.eu/api'~\n",
"\n",
"~KEY = 'xxx'~\n",
"\n",
@@ -305,8 +306,8 @@
"source": [
"#### Search for data\n",
"\n",
- "To search for data, visit the CDS website: https://cds-beta.climate.copernicus.eu/.\n",
- "Here you can search for HIRS OLR data using the search bar. The data we need for this use case is the [Earth's Radiation Budget from 1979 to present derived from satellite observations](https://cds-beta.climate.copernicus.eu/datasets/satellite-earth-radiation-budget?tab=overview).\n",
+ "To search for data, visit the CDS website: https://cds.climate.copernicus.eu/.\n",
+ "Here you can search for HIRS OLR data using the search bar. The data we need for this use case is the [Earth's Radiation Budget from 1979 to present derived from satellite observations](https://cds.climate.copernicus.eu/datasets/satellite-earth-radiation-budget?tab=overview).\n",
"The Earth Radiation Budget (ERB) comprises the quantification of the incoming radiation from the Sun and the outgoing reflected shortwave and emitted longwave radiation. This catalogue entry comprises data from a number of sources."
]
},
@@ -828,8 +829,8 @@
"source": [
"## Get more information about Earth Radiation Budget:\n",
"\n",
- "- [Earth's radiation budget from 1979 to present derived from satellite observations](https://cds-beta.climate.copernicus.eu/datasets/satellite-earth-radiation-budget?tab=overview)\n",
- "- [Climate Data Store](https://cds-beta.climate.copernicus.eu/)"
+ "- [Earth's radiation budget from 1979 to present derived from satellite observations](https://cds.climate.copernicus.eu/datasets/satellite-earth-radiation-budget?tab=overview)\n",
+ "- [Climate Data Store](https://cds.climate.copernicus.eu/)"
]
},
{
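Each notebook's setup prompts for a **URL** and **KEY**; the `cdsapi` client conventionally reads these from a `.cdsapirc` file in the home directory. A minimal sketch of writing one (the helper name is ours, and the file goes to a temporary directory here rather than `$HOME`):

```python
from pathlib import Path
import tempfile

def write_cdsapirc(url: str, key: str, directory: Path) -> Path:
    """Write a minimal .cdsapirc; cdsapi looks for this file in $HOME by default."""
    rc = directory / ".cdsapirc"
    rc.write_text(f"url: {url}\nkey: {key}\n")
    return rc

# Demo only: a throwaway directory and a placeholder key.
with tempfile.TemporaryDirectory() as d:
    rc = write_cdsapirc("https://cds.climate.copernicus.eu/api", "xxx", Path(d))
    print(rc.read_text())
```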
diff --git a/tutorials/03_climate_projection/03x01_projections-cmip6.ipynb b/tutorials/03_climate_projection/03x01_projections-cmip6.ipynb
index cdfb76b..12c9b9f 100644
--- a/tutorials/03_climate_projection/03x01_projections-cmip6.ipynb
+++ b/tutorials/03_climate_projection/03x01_projections-cmip6.ipynb
@@ -20,7 +20,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "This notebook provides a practical introduction on how to access and process [CMIP6 global climate projections](https://cds-beta.climate.copernicus.eu/datasets/projections-cmip6?tab=overview) data available in the Climate Data Store (CDS) of the Copernicus Climate Change Service (C3S). The workflow shows how to compute and visualize the output of an ensemble of models for the annual global average temperature between 1850 to 2100. You will use the `historical` experiment for the temporal period 1850 to 2014 and the three scenarios `SSP1-2.6`, `SSP2-4.5` and `SSP5-8.5` for the period from 2015 to 2100.\n",
+ "This notebook provides a practical introduction on how to access and process [CMIP6 global climate projections](https://cds.climate.copernicus.eu/datasets/projections-cmip6?tab=overview) data available in the Climate Data Store (CDS) of the Copernicus Climate Change Service (C3S). The workflow shows how to compute and visualize the output of an ensemble of models for the annual global average temperature between 1850 to 2100. You will use the `historical` experiment for the temporal period 1850 to 2014 and the three scenarios `SSP1-2.6`, `SSP2-4.5` and `SSP5-8.5` for the period from 2015 to 2100.\n",
"\n",
"For the sake of simplicity, and to facilitate data download, the tutorial will make use of some of the coarser resolution models that have a smaller data size. It is nevertheless only a choice for this exercise and not a recommendation (since ideally all models, including those with highest resolution, should be used). Many more models are available on the CDS, and when calculating an ensemble of models, it is best practice to use as many as possible for a more reliable output. See [here](https://confluence.ecmwf.int/display/CKB/CMIP6%3A+Global+climate+projections#CMIP6:Globalclimateprojections-Models,gridsandpressurelevels) a full list of models included in the CDS-CMIP6 dataset.\n",
"\n",
@@ -54,7 +54,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "This notebook introduces you to [CMIP6 Global climate projections](https://cds-beta.climate.copernicus.eu/datasets/projections-cmip6?tab=overview). The datasets used in the notebook have the following specifications:\n",
+ "This notebook introduces you to [CMIP6 Global climate projections](https://cds.climate.copernicus.eu/datasets/projections-cmip6?tab=overview). The datasets used in the notebook have the following specifications:\n",
"\n",
"> **Data**: CMIP6 global climate projections of near-surface air temperature
\n",
"> **Experiments**: Historical, SSP1-2.6, SSP2-4.5, SSP5-8.5
\n",
@@ -82,8 +82,8 @@
"The library prompts us to enter our credentials, which are then securely saved in our workspace. **This request is only made the first time**; afterward, the `get_credentials` function will automatically retrieve the credentials from the environment or workspace, eliminating the need to re-enter them in the Jupyter notebook.\n",
"\n",
"To obtain your API credentials:\n",
- "1. Register or log in to the CDS at [https://cds-beta.climate.copernicus.eu](https://cds-beta.climate.copernicus.eu).\n",
- "2. Visit [https://cds-beta.climate.copernicus.eu/how-to-api](https://cds-beta.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
+ "1. Register or log in to the CDS at [https://cds.climate.copernicus.eu](https://cds.climate.copernicus.eu).\n",
+ "2. Visit [https://cds.climate.copernicus.eu/how-to-api](https://cds.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
"\n",
"The library will prompt you to enter:\n",
"- **URL**: The URL field is prefilled; simply press Enter to accept the default.\n",
@@ -240,9 +240,9 @@
"\n",
"~First, you have to define two variables: `URL` and `KEY` which build together your CDS API key.~\n",
"\n",
- "~The string of characters that make up your KEY include your personal User ID and CDS API key. To obtain these, first register or login to the CDS (https://cds-beta.climate.copernicus.eu), then visit https://cds-beta.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
+ "~The string of characters that make up your KEY include your personal User ID and CDS API key. To obtain these, first register or login to the CDS (https://cds.climate.copernicus.eu), then visit https://cds.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
"\n",
- "~URL = 'https://cds-beta.climate.copernicus.eu/api'~\n",
+ "~URL = 'https://cds.climate.copernicus.eu/api'~\n",
"\n",
"~KEY = 'xxx'~\n",
"\n",
@@ -299,7 +299,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "> **Note:** Note that these are a selection of the lightest models (in terms of data volume), to facilitate download for the sake of this exercise. There are many [more models available on the CDS](https://cds-beta.climate.copernicus.eu/datasets/projections-cmip6?tab=overview)."
+ "> **Note:** Note that these are a selection of the lightest models (in terms of data volume), to facilitate download for the sake of this exercise. There are many [more models available on the CDS](https://cds.climate.copernicus.eu/datasets/projections-cmip6?tab=overview)."
]
},
{
@@ -308,14 +308,14 @@
"source": [
"Now we can download the data for each model and experiment sequentially. We will do this separately for the historical experiments and for the various future scenarios, given that they refer to two different time periods.\n",
"\n",
- "Before you run the cells below, the terms and conditions on the use of the data need to have been accepted in the CDS. You can view and accept these conditions by logging into the [CDS](https://cds-beta.climate.copernicus.eu), searching for the dataset, then scrolling to the end of the `Download data` section."
+ "Before you run the cells below, the terms and conditions on the use of the data need to have been accepted in the CDS. You can view and accept these conditions by logging into the [CDS](https://cds.climate.copernicus.eu), searching for the dataset, then scrolling to the end of the `Download data` section."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
- "> **Note:** For more information about data access through the Climate Data Store, please see the CDS user guide [here](https://cds-beta.climate.copernicus.eu/user-guide)."
+ "> **Note:** For more information about data access through the Climate Data Store, please see the CDS user guide [here](https://cds.climate.copernicus.eu/user-guide)."
]
},
{
@@ -835,13 +835,6 @@
"data_50 = data.quantile(0.5, dim='model')"
]
},
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": []
- },
{
"cell_type": "markdown",
"metadata": {},
diff --git a/tutorials/03_climate_projection/03x01_projections-cmip6_parallel.ipynb b/tutorials/03_climate_projection/03x01_projections-cmip6_parallel.ipynb
index a2fc3c6..2e65bc1 100644
--- a/tutorials/03_climate_projection/03x01_projections-cmip6_parallel.ipynb
+++ b/tutorials/03_climate_projection/03x01_projections-cmip6_parallel.ipynb
@@ -20,7 +20,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "This notebook provides a practical introduction on how to access and process [CMIP6 global climate projections](https://cds-beta.climate.copernicus.eu/datasets/projections-cmip6?tab=overview) data available in the Climate Data Store (CDS) of the Copernicus Climate Change Service (C3S). The workflow shows how to compute and visualize the output of an ensemble of models for the annual global average temperature between 1850 to 2100. You will use the `historical` experiment for the temporal period 1850 to 2014 and the three scenarios `SSP1-2.6`, `SSP2-4.5` and `SSP5-8.5` for the period from 2015 to 2100.\n",
+ "This notebook provides a practical introduction on how to access and process [CMIP6 global climate projections](https://cds.climate.copernicus.eu/datasets/projections-cmip6?tab=overview) data available in the Climate Data Store (CDS) of the Copernicus Climate Change Service (C3S). The workflow shows how to compute and visualize the output of an ensemble of models for the annual global average temperature between 1850 to 2100. You will use the `historical` experiment for the temporal period 1850 to 2014 and the three scenarios `SSP1-2.6`, `SSP2-4.5` and `SSP5-8.5` for the period from 2015 to 2100.\n",
"\n",
"For the sake of simplicity, and to facilitate data download, the tutorial will make use of some of the coarser resolution models that have a smaller data size. It is nevertheless only a choice for this exercise and not a recommendation (since ideally all models, including those with highest resolution, should be used). Many more models are available on the CDS, and when calculating an ensemble of models, it is best practice to use as many as possible for a more reliable output. See [here](https://confluence.ecmwf.int/display/CKB/CMIP6%3A+Global+climate+projections#CMIP6:Globalclimateprojections-Models,gridsandpressurelevels) a full list of models included in the CDS-CMIP6 dataset.\n",
"\n",
@@ -54,7 +54,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "This notebook introduces you to [CMIP6 Global climate projections](https://cds-beta.climate.copernicus.eu/datasets/projections-cmip6?tab=overview). The datasets used in the notebook have the following specifications:\n",
+ "This notebook introduces you to [CMIP6 Global climate projections](https://cds.climate.copernicus.eu/datasets/projections-cmip6?tab=overview). The datasets used in the notebook have the following specifications:\n",
"\n",
"> **Data**: CMIP6 global climate projections of near-surface air temperature
\n",
"> **Experiments**: Historical, SSP1-2.6, SSP2-4.5, SSP5-8.5
\n",
@@ -75,8 +75,8 @@
"The library prompts us to enter our credentials, which are then securely saved in our workspace. **This request is only made the first time**; afterward, the `get_credentials` function will automatically retrieve the credentials from the environment or workspace, eliminating the need to re-enter them in the Jupyter notebook.\n",
"\n",
"To obtain your API credentials:\n",
- "1. Register or log in to the CDS at [https://cds-beta.climate.copernicus.eu](https://cds-beta.climate.copernicus.eu).\n",
- "2. Visit [https://cds-beta.climate.copernicus.eu/how-to-api](https://cds-beta.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
+ "1. Register or log in to the CDS at [https://cds.climate.copernicus.eu](https://cds.climate.copernicus.eu).\n",
+ "2. Visit [https://cds.climate.copernicus.eu/how-to-api](https://cds.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
"\n",
"The library will prompt you to enter:\n",
"- **URL**: The URL field is prefilled; simply press Enter to accept the default.\n",
@@ -233,9 +233,9 @@
"\n",
"~First, you have to define two variables: `URL` and `KEY` which build together your CDS API key.~\n",
"\n",
- "~The string of characters that make up your KEY include your personal User ID and CDS API key. To obtain these, first register or login to the CDS (https://cds-beta.climate.copernicus.eu), then visit https://cds-beta.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
+ "~The string of characters that make up your KEY include your personal User ID and CDS API key. To obtain these, first register or login to the CDS (https://cds.climate.copernicus.eu), then visit https://cds.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
"\n",
- "~URL = 'https://cds-beta.climate.copernicus.eu/api'~\n",
+ "~URL = 'https://cds.climate.copernicus.eu/api'~\n",
"\n",
"~KEY = 'xxx'~\n",
"\n",
@@ -292,7 +292,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "> **Note:** Note that these are a selection of the lightest models (in terms of data volume), to facilitate download for the sake of this exercise. There are many [more models available on the CDS](https://cds-beta.climate.copernicus.eu/datasets/projections-cmip6?tab=overview)."
+ "> **Note:** Note that these are a selection of the lightest models (in terms of data volume), to facilitate download for the sake of this exercise. There are many [more models available on the CDS](https://cds.climate.copernicus.eu/datasets/projections-cmip6?tab=overview)."
]
},
{
@@ -301,14 +301,14 @@
"source": [
"Now we can download the data for each model and experiment sequentially. We will do this separately for the historical experiments and for the various future scenarios, given that they refer to two different time periods.\n",
"\n",
- "Before you run the cells below, the terms and conditions on the use of the data need to have been accepted in the CDS. You can view and accept these conditions by logging into the [CDS](https://cds-beta.climate.copernicus.eu), searching for the dataset, then scrolling to the end of the `Download data` section."
+ "Before you run the cells below, the terms and conditions on the use of the data need to have been accepted in the CDS. You can view and accept these conditions by logging into the [CDS](https://cds.climate.copernicus.eu), searching for the dataset, then scrolling to the end of the `Download data` section."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
- "> **Note:** For more information about data access through the Climate Data Store, please see the CDS user guide [here](https://cds-beta.climate.copernicus.eu/user-guide)."
+ "> **Note:** For more information about data access through the Climate Data Store, please see the CDS user guide [here](https://cds.climate.copernicus.eu/user-guide)."
]
},
{
@@ -1065,7 +1065,7 @@
],
"metadata": {
"kernelspec": {
- "display_name": "venv",
+ "display_name": "Python 3",
"language": "python",
"name": "python3"
},
@@ -1079,7 +1079,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.13.0"
+ "version": "3.8.5"
}
},
"nbformat": 4,
diff --git a/tutorials/03_climate_projection/03x02_projections-cordex.ipynb b/tutorials/03_climate_projection/03x02_projections-cordex.ipynb
index 57409c5..818f019 100644
--- a/tutorials/03_climate_projection/03x02_projections-cordex.ipynb
+++ b/tutorials/03_climate_projection/03x02_projections-cordex.ipynb
@@ -18,7 +18,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "This notebook is a practical introduction to [CORDEX regional climate model data on single levels](https://cds-beta.climate.copernicus.eu/datasets/projections-cordex-domains-single-levels?tab=overview). CORDEX data are available for 14 regional domains and variable spatial resolutions, from 0.11 x 0.11 degrees up to 0.44 x 0.44 degrees. This workflow will demonstrate how to compute the difference in the air temperature climatology for 2071-2100 (according to a projected scenario) relative to the reference period 1971-2000 in Africa, with a spatial resolution of 0.44 x 0.44 degrees. \n",
+ "This notebook is a practical introduction to [CORDEX regional climate model data on single levels](https://cds.climate.copernicus.eu/datasets/projections-cordex-domains-single-levels?tab=overview). CORDEX data are available for 14 regional domains and variable spatial resolutions, from 0.11 x 0.11 degrees up to 0.44 x 0.44 degrees. This workflow will demonstrate how to compute the difference in the air temperature climatology for 2071-2100 (according to a projected scenario) relative to the reference period 1971-2000 in Africa, with a spatial resolution of 0.44 x 0.44 degrees. \n",
"\n",
"We will use the historical experiment to compute the past climatology and repeat the steps to compute the projected climatology according to the scenario RCP4.5. Finally, we will take the difference between the two in order to assess the extent of change. This is called the \"delta method\" which takes the climate change signal as the difference between the future and past considering that the model biases are the same and can therefore be removed with the subtraction.\n",
"\n",
@@ -62,7 +62,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "This notebook introduces you to [CORDEX regional climate model data on single levels](https://cds-beta.climate.copernicus.eu/datasets/projections-cordex-domains-single-levels?tab=overview). The data used in the notebook has the following specifications:\n",
+ "This notebook introduces you to [CORDEX regional climate model data on single levels](https://cds.climate.copernicus.eu/datasets/projections-cordex-domains-single-levels?tab=overview). The data used in the notebook has the following specifications:\n",
"\n",
"> **Data**: `CORDEX regional climate model data on single levels - Experiment: Historical`
\n",
"> **Temporal coverage**: `1 Jan 1971 to 31 Dec 2000`
\n",
@@ -95,8 +95,8 @@
"The library prompts us to enter our credentials, which are then securely saved in our workspace. **This request is only made the first time**; afterward, the `get_credentials` function will automatically retrieve the credentials from the environment or workspace, eliminating the need to re-enter them in the Jupyter notebook.\n",
"\n",
"To obtain your API credentials:\n",
- "1. Register or log in to the CDS at [https://cds-beta.climate.copernicus.eu](https://cds-beta.climate.copernicus.eu).\n",
- "2. Visit [https://cds-beta.climate.copernicus.eu/how-to-api](https://cds-beta.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
+ "1. Register or log in to the CDS at [https://cds.climate.copernicus.eu](https://cds.climate.copernicus.eu).\n",
+ "2. Visit [https://cds.climate.copernicus.eu/how-to-api](https://cds.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
"\n",
"The library will prompt you to enter:\n",
"- **URL**: The URL field is prefilled; simply press Enter to accept the default.\n",
@@ -229,9 +229,9 @@
"\n",
"~First, you have to define two variables: `URL` and `KEY` which build together your CDS API key.~\n",
"\n",
- "~The string of characters that make up your KEY include your personal User ID and CDS API key. To obtain these, first register or login to the CDS (https://cds-beta.climate.copernicus.eu), then visit https://cds-beta.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
+ "~The string of characters that make up your KEY include your personal User ID and CDS API key. To obtain these, first register or login to the CDS (https://cds.climate.copernicus.eu), then visit https://cds.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
"\n",
- "~URL = 'https://cds-beta.climate.copernicus.eu/api'~\n",
+ "~URL = 'https://cds.climate.copernicus.eu/api'~\n",
"\n",
"~KEY = 'xxx'~\n",
"\n",
@@ -248,7 +248,7 @@
"* **Historical experiment**: Daily aggregated historical 2m air temperature (from the CanRCM4 - CanESM2 model) from 1971 to 2000 for Africa.\n",
"* **RCP4.5 experiment**: Daily aggregated RCP4.5 projections of 2m air temperature (from the CanRCM4 - CanESM2 model) from 2071 to 2100 for Africa\n",
"\n",
- "> **Note:** Before you run the cells below, the terms and conditions on the use of the data need to have been accepted in the CDS. You can view and accept these conditions by logging into the [CDS](https://cds-beta.climate.copernicus.eu), searching for the dataset, then scrolling to the end of the *Download data* section."
+ "> **Note:** Before you run the cells below, the terms and conditions on the use of the data need to have been accepted in the CDS. You can view and accept these conditions by logging into the [CDS](https://cds.climate.copernicus.eu), searching for the dataset, then scrolling to the end of the *Download data* section."
]
},
{
@@ -631,7 +631,7 @@
],
"metadata": {
"kernelspec": {
- "display_name": "venv",
+ "display_name": "Python 3",
"language": "python",
"name": "python3"
},
@@ -645,7 +645,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.13.0"
+ "version": "3.8.5"
}
},
"nbformat": 4,
diff --git a/tutorials/04_seasonal_forecast/04x01_sf-anomalies.ipynb b/tutorials/04_seasonal_forecast/04x01_sf-anomalies.ipynb
index ca16cbd..7616190 100644
--- a/tutorials/04_seasonal_forecast/04x01_sf-anomalies.ipynb
+++ b/tutorials/04_seasonal_forecast/04x01_sf-anomalies.ipynb
@@ -16,13 +16,6 @@
"# Seasonal Forecast Anomalies"
]
},
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": []
- },
{
"cell_type": "markdown",
"metadata": {
@@ -94,7 +87,7 @@
"source": [
""
+ " Precomputed anomalies are also available through the CDS. Note these may be slightly different due to minor differences in the way they are computed (e.g. months of constant length, 30 days) and also due to GRIB packing discretisation. See here for more detials."
]
},
{
@@ -110,7 +103,14 @@
"tags": []
},
"source": [
- "Please see here the full documentation of the [C3S Seasonal Forecast Datasets](https://confluence.ecmwf.int/display/CKB/C3S+Seasonal+Forecasts%3A+datasets+documentation). This notebook introduces you to the [seasonal forecast monthly statistics](https://cds-beta.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=overview) datasets on single levels (as opposed to multiple levels in the atmosphere)."
+ "Please see the full documentation of the [C3S Seasonal Forecast Datasets](https://confluence.ecmwf.int/display/CKB/C3S+Seasonal+Forecasts%3A+datasets+documentation). This notebook introduces you to the [seasonal forecast monthly statistics](https://cds.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=overview) datasets on single levels (as opposed to multiple levels in the atmosphere)."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "---"
]
},
{
@@ -124,8 +124,8 @@
"The library prompts us to enter our credentials, which are then securely saved in our workspace. **This request is only made the first time**; afterward, the `get_credentials` function will automatically retrieve the credentials from the environment or workspace, eliminating the need to re-enter them in the Jupyter notebook.\n",
"\n",
"To obtain your API credentials:\n",
- "1. Register or log in to the CDS at [https://cds-beta.climate.copernicus.eu](https://cds-beta.climate.copernicus.eu).\n",
- "2. Visit [https://cds-beta.climate.copernicus.eu/how-to-api](https://cds-beta.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
+ "1. Register or log in to the CDS at [https://cds.climate.copernicus.eu](https://cds.climate.copernicus.eu).\n",
+ "2. Visit [https://cds.climate.copernicus.eu/how-to-api](https://cds.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
"\n",
"The library will prompt you to enter:\n",
"- **URL**: The URL field is prefilled; simply press Enter to accept the default.\n",
@@ -207,13 +207,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:06:51.461916Z",
- "iopub.status.busy": "2022-07-04T13:06:51.461438Z",
- "iopub.status.idle": "2022-07-04T13:07:06.209789Z",
- "shell.execute_reply": "2022-07-04T13:07:06.208510Z",
- "shell.execute_reply.started": "2022-07-04T13:06:51.461874Z"
- },
"papermill": {
"duration": 12.174934,
"end_time": "2022-03-14T17:49:58.941932",
@@ -233,13 +226,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:04:16.383333Z",
- "iopub.status.busy": "2022-07-04T13:04:16.382811Z",
- "iopub.status.idle": "2022-07-04T13:06:41.385440Z",
- "shell.execute_reply": "2022-07-04T13:06:41.383951Z",
- "shell.execute_reply.started": "2022-07-04T13:04:16.383238Z"
- },
"papermill": {
"duration": 127.036612,
"end_time": "2022-03-14T17:52:06.036378",
@@ -276,13 +262,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T14:29:57.194546Z",
- "iopub.status.busy": "2022-07-04T14:29:57.193516Z",
- "iopub.status.idle": "2022-07-04T14:29:57.204218Z",
- "shell.execute_reply": "2022-07-04T14:29:57.202881Z",
- "shell.execute_reply.started": "2022-07-04T14:29:57.194492Z"
- },
"papermill": {
"duration": 2.519057,
"end_time": "2022-03-14T17:52:09.394032",
@@ -378,9 +357,9 @@
"\n",
"~First, you have to define two variables: `URL` and `KEY` which build together your CDS API key.~\n",
"\n",
- "~The string of characters that make up your KEY include your personal User ID and CDS API key. To obtain these, first register or login to the CDS (https://cds-beta.climate.copernicus.eu), then visit https://cds-beta.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
+ "~The string of characters that make up your KEY includes your personal User ID and CDS API key. To obtain these, first register or log in to the CDS (https://cds.climate.copernicus.eu), then visit https://cds.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
"\n",
- "~URL = 'https://cds-beta.climate.copernicus.eu/api'~\n",
+ "~URL = 'https://cds.climate.copernicus.eu/api'~\n",
"\n",
"~KEY = 'xxx'~\n",
"\n",
@@ -426,7 +405,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "If you have not already done so, you will need to accept the **terms & conditions** of the data before you can download it. These can be viewed and accepted in the [CDS download page](https://cds-beta.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=download) by scrolling to the end of the download form."
+ "If you have not already done so, you will need to accept the **terms & conditions** of the data before you can download it. These can be viewed and accepted in the [CDS download page](https://cds.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=download) by scrolling to the end of the download form."
]
},
{
@@ -435,20 +414,13 @@
"source": [
"\n",
"<b>NOTE:</b>\n",
- " The API request below can be generated automatically from the <a href=\"https://cds-beta.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=download\">CDS download page</a>. At the end of the download form there is a <i>Show API request</i> icon, which allows a copy-paste of the code below."
+ " The API request below can be generated automatically from the <a href=\"https://cds.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=download\">CDS download page</a>. At the end of the download form there is a <i>Show API request</i> icon, which allows you to copy and paste the code below."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-03-22T11:06:46.261964Z",
- "iopub.status.busy": "2022-03-22T11:06:46.26171Z",
- "iopub.status.idle": "2022-03-22T11:16:19.250634Z",
- "shell.execute_reply": "2022-03-22T11:16:19.24926Z",
- "shell.execute_reply.started": "2022-03-22T11:06:46.26194Z"
- },
"papermill": {
"duration": 307.378243,
"end_time": "2022-03-14T17:57:27.00603",
@@ -572,13 +544,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:01:45.634652Z",
- "iopub.status.busy": "2022-07-04T15:01:45.634253Z",
- "iopub.status.idle": "2022-07-04T15:01:52.107563Z",
- "shell.execute_reply": "2022-07-04T15:01:52.106228Z",
- "shell.execute_reply.started": "2022-07-04T15:01:45.634619Z"
- },
"papermill": {
"duration": 0.564456,
"end_time": "2022-03-14T17:57:30.177156",
@@ -631,13 +596,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:01:52.110136Z",
- "iopub.status.busy": "2022-07-04T15:01:52.109756Z",
- "iopub.status.idle": "2022-07-04T15:01:58.292198Z",
- "shell.execute_reply": "2022-07-04T15:01:58.290900Z",
- "shell.execute_reply.started": "2022-07-04T15:01:52.110104Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -675,13 +633,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:01:58.294564Z",
- "iopub.status.busy": "2022-07-04T15:01:58.294214Z",
- "iopub.status.idle": "2022-07-04T15:01:58.325219Z",
- "shell.execute_reply": "2022-07-04T15:01:58.324006Z",
- "shell.execute_reply.started": "2022-07-04T15:01:58.294532Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -726,13 +677,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:02:02.851493Z",
- "iopub.status.busy": "2022-07-04T15:02:02.851082Z",
- "iopub.status.idle": "2022-07-04T15:02:11.876831Z",
- "shell.execute_reply": "2022-07-04T15:02:11.875599Z",
- "shell.execute_reply.started": "2022-07-04T15:02:02.851460Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -804,13 +748,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:02:11.879426Z",
- "iopub.status.busy": "2022-07-04T15:02:11.879064Z",
- "iopub.status.idle": "2022-07-04T15:02:12.469303Z",
- "shell.execute_reply": "2022-07-04T15:02:12.468131Z",
- "shell.execute_reply.started": "2022-07-04T15:02:11.879392Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -850,13 +787,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:02:12.471473Z",
- "iopub.status.busy": "2022-07-04T15:02:12.471083Z",
- "iopub.status.idle": "2022-07-04T15:02:13.319282Z",
- "shell.execute_reply": "2022-07-04T15:02:13.318089Z",
- "shell.execute_reply.started": "2022-07-04T15:02:12.471441Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -898,13 +828,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:02:13.904943Z",
- "iopub.status.busy": "2022-07-04T15:02:13.904509Z",
- "iopub.status.idle": "2022-07-04T15:02:13.990838Z",
- "shell.execute_reply": "2022-07-04T15:02:13.989578Z",
- "shell.execute_reply.started": "2022-07-04T15:02:13.904905Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -943,13 +866,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:02:17.923983Z",
- "iopub.status.busy": "2022-07-04T15:02:17.923558Z",
- "iopub.status.idle": "2022-07-04T15:02:18.001598Z",
- "shell.execute_reply": "2022-07-04T15:02:18.000228Z",
- "shell.execute_reply.started": "2022-07-04T15:02:17.923946Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -986,13 +902,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:02:24.934191Z",
- "iopub.status.busy": "2022-07-04T15:02:24.933762Z",
- "iopub.status.idle": "2022-07-04T15:02:25.298687Z",
- "shell.execute_reply": "2022-07-04T15:02:25.297534Z",
- "shell.execute_reply.started": "2022-07-04T15:02:24.934154Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1017,15 +926,7 @@
{
"cell_type": "code",
"execution_count": null,
- "metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:02:28.879408Z",
- "iopub.status.busy": "2022-07-04T15:02:28.879024Z",
- "iopub.status.idle": "2022-07-04T15:02:28.953330Z",
- "shell.execute_reply": "2022-07-04T15:02:28.952441Z",
- "shell.execute_reply.started": "2022-07-04T15:02:28.879375Z"
- }
- },
+ "metadata": {},
"outputs": [],
"source": [
"seas5_anomalies_202105_tp.attrs['units'] = 'mm'\n",
@@ -1077,13 +978,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:02:34.553214Z",
- "iopub.status.busy": "2022-07-04T15:02:34.552462Z",
- "iopub.status.idle": "2022-07-04T15:02:34.559923Z",
- "shell.execute_reply": "2022-07-04T15:02:34.558981Z",
- "shell.execute_reply.started": "2022-07-04T15:02:34.553165Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1134,13 +1028,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:02:36.520814Z",
- "iopub.status.busy": "2022-07-04T15:02:36.519570Z",
- "iopub.status.idle": "2022-07-04T15:02:36.666938Z",
- "shell.execute_reply": "2022-07-04T15:02:36.665765Z",
- "shell.execute_reply.started": "2022-07-04T15:02:36.520724Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1191,15 +1078,7 @@
{
"cell_type": "code",
"execution_count": null,
- "metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:02:40.258365Z",
- "iopub.status.busy": "2022-07-04T15:02:40.257941Z",
- "iopub.status.idle": "2022-07-04T15:02:40.263430Z",
- "shell.execute_reply": "2022-07-04T15:02:40.262551Z",
- "shell.execute_reply.started": "2022-07-04T15:02:40.258326Z"
- }
- },
+ "metadata": {},
"outputs": [],
"source": [
"# Select a leadtime to visualise\n",
@@ -1220,15 +1099,7 @@
{
"cell_type": "code",
"execution_count": null,
- "metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T16:04:37.766742Z",
- "iopub.status.busy": "2022-07-04T16:04:37.766323Z",
- "iopub.status.idle": "2022-07-04T16:04:42.200774Z",
- "shell.execute_reply": "2022-07-04T16:04:42.199004Z",
- "shell.execute_reply.started": "2022-07-04T16:04:37.766709Z"
- }
- },
+ "metadata": {},
"outputs": [],
"source": [
"# Define figure and spacing between subplots\n",
@@ -1343,13 +1214,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:03:02.382605Z",
- "iopub.status.busy": "2022-07-04T15:03:02.382186Z",
- "iopub.status.idle": "2022-07-04T15:03:02.439356Z",
- "shell.execute_reply": "2022-07-04T15:03:02.438037Z",
- "shell.execute_reply.started": "2022-07-04T15:03:02.382572Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1394,13 +1258,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:37:39.015946Z",
- "iopub.status.busy": "2022-07-04T15:37:39.015469Z",
- "iopub.status.idle": "2022-07-04T15:37:39.041040Z",
- "shell.execute_reply": "2022-07-04T15:37:39.039762Z",
- "shell.execute_reply.started": "2022-07-04T15:37:39.015908Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1440,13 +1297,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:37:43.456264Z",
- "iopub.status.busy": "2022-07-04T15:37:43.455882Z",
- "iopub.status.idle": "2022-07-04T15:37:43.500314Z",
- "shell.execute_reply": "2022-07-04T15:37:43.499244Z",
- "shell.execute_reply.started": "2022-07-04T15:37:43.456232Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1474,15 +1324,7 @@
{
"cell_type": "code",
"execution_count": null,
- "metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:57:11.263193Z",
- "iopub.status.busy": "2022-07-04T15:57:11.262732Z",
- "iopub.status.idle": "2022-07-04T15:57:11.306857Z",
- "shell.execute_reply": "2022-07-04T15:57:11.305730Z",
- "shell.execute_reply.started": "2022-07-04T15:57:11.263156Z"
- }
- },
+ "metadata": {},
"outputs": [],
"source": [
"anoms_SAsia_m_yr = anoms_SAsia_df.reset_index()\n",
@@ -1513,13 +1355,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:29:45.428774Z",
- "iopub.status.busy": "2022-07-04T13:29:45.428299Z",
- "iopub.status.idle": "2022-07-04T13:29:46.548429Z",
- "shell.execute_reply": "2022-07-04T13:29:46.547108Z",
- "shell.execute_reply.started": "2022-07-04T13:29:45.428723Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1554,13 +1389,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:29:48.408037Z",
- "iopub.status.busy": "2022-07-04T13:29:48.407594Z",
- "iopub.status.idle": "2022-07-04T13:29:48.441601Z",
- "shell.execute_reply": "2022-07-04T13:29:48.440283Z",
- "shell.execute_reply.started": "2022-07-04T13:29:48.407999Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1602,13 +1430,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:29:50.500927Z",
- "iopub.status.busy": "2022-07-04T13:29:50.500493Z",
- "iopub.status.idle": "2022-07-04T13:29:50.508674Z",
- "shell.execute_reply": "2022-07-04T13:29:50.507912Z",
- "shell.execute_reply.started": "2022-07-04T13:29:50.500890Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1643,13 +1464,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:30:36.424840Z",
- "iopub.status.busy": "2022-07-04T13:30:36.424422Z",
- "iopub.status.idle": "2022-07-04T13:30:36.437015Z",
- "shell.execute_reply": "2022-07-04T13:30:36.435639Z",
- "shell.execute_reply.started": "2022-07-04T13:30:36.424791Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1670,13 +1484,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:30:39.004874Z",
- "iopub.status.busy": "2022-07-04T13:30:39.004468Z",
- "iopub.status.idle": "2022-07-04T13:30:39.011375Z",
- "shell.execute_reply": "2022-07-04T13:30:39.010352Z",
- "shell.execute_reply.started": "2022-07-04T13:30:39.004831Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1713,13 +1520,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:30:53.511392Z",
- "iopub.status.busy": "2022-07-04T13:30:53.510963Z",
- "iopub.status.idle": "2022-07-04T13:30:53.524079Z",
- "shell.execute_reply": "2022-07-04T13:30:53.522843Z",
- "shell.execute_reply.started": "2022-07-04T13:30:53.511354Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1767,13 +1567,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T15:58:02.020933Z",
- "iopub.status.busy": "2022-07-04T15:58:02.020124Z",
- "iopub.status.idle": "2022-07-04T15:58:02.578521Z",
- "shell.execute_reply": "2022-07-04T15:58:02.576732Z",
- "shell.execute_reply.started": "2022-07-04T15:58:02.020889Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1868,13 +1661,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:51:06.550875Z",
- "iopub.status.busy": "2022-07-04T13:51:06.550436Z",
- "iopub.status.idle": "2022-07-04T13:51:17.316648Z",
- "shell.execute_reply": "2022-07-04T13:51:17.315434Z",
- "shell.execute_reply.started": "2022-07-04T13:51:06.550841Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1912,13 +1698,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:51:17.319738Z",
- "iopub.status.busy": "2022-07-04T13:51:17.319231Z",
- "iopub.status.idle": "2022-07-04T13:51:18.943394Z",
- "shell.execute_reply": "2022-07-04T13:51:18.942101Z",
- "shell.execute_reply.started": "2022-07-04T13:51:17.319691Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -1956,13 +1735,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:51:18.945063Z",
- "iopub.status.busy": "2022-07-04T13:51:18.944720Z",
- "iopub.status.idle": "2022-07-04T13:51:19.084443Z",
- "shell.execute_reply": "2022-07-04T13:51:19.083329Z",
- "shell.execute_reply.started": "2022-07-04T13:51:18.945027Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -2003,13 +1775,6 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:51:19.087119Z",
- "iopub.status.busy": "2022-07-04T13:51:19.086740Z",
- "iopub.status.idle": "2022-07-04T13:51:19.113561Z",
- "shell.execute_reply": "2022-07-04T13:51:19.112496Z",
- "shell.execute_reply.started": "2022-07-04T13:51:19.087087Z"
- },
"papermill": {
"duration": null,
"end_time": null,
@@ -2061,15 +1826,7 @@
{
"cell_type": "code",
"execution_count": null,
- "metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:51:19.116747Z",
- "iopub.status.busy": "2022-07-04T13:51:19.115259Z",
- "iopub.status.idle": "2022-07-04T13:51:19.124562Z",
- "shell.execute_reply": "2022-07-04T13:51:19.123416Z",
- "shell.execute_reply.started": "2022-07-04T13:51:19.116692Z"
- }
- },
+ "metadata": {},
"outputs": [],
"source": [
"vts_names"
@@ -2078,15 +1835,7 @@
{
"cell_type": "code",
"execution_count": null,
- "metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:51:19.126281Z",
- "iopub.status.busy": "2022-07-04T13:51:19.125952Z",
- "iopub.status.idle": "2022-07-04T13:51:19.138576Z",
- "shell.execute_reply": "2022-07-04T13:51:19.137614Z",
- "shell.execute_reply.started": "2022-07-04T13:51:19.126251Z"
- }
- },
+ "metadata": {},
"outputs": [],
"source": [
"dropdown_opts = [(vts_names[mm-1],mm) for mm in range(3,7)]"
@@ -2095,15 +1844,7 @@
{
"cell_type": "code",
"execution_count": null,
- "metadata": {
- "execution": {
- "iopub.execute_input": "2022-07-04T13:51:19.141177Z",
- "iopub.status.busy": "2022-07-04T13:51:19.140235Z",
- "iopub.status.idle": "2022-07-04T13:51:26.043478Z",
- "shell.execute_reply": "2022-07-04T13:51:26.042403Z",
- "shell.execute_reply.started": "2022-07-04T13:51:19.141138Z"
- }
- },
+ "metadata": {},
"outputs": [],
"source": [
"tp_colors = [(153/255.,51/255.,0),(204/255.,136/255.,0),(1,213/255.,0),\n",
@@ -2177,7 +1918,7 @@
],
"metadata": {
"kernelspec": {
- "display_name": "venv",
+ "display_name": "Python 3",
"language": "python",
"name": "python3"
},
@@ -2191,7 +1932,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.13.0"
+ "version": "3.8.5"
}
},
"nbformat": 4,
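For orientation, the download step that this notebook performs via `cdsapi` follows the pattern sketched below. The dataset name comes from the notebook itself, but every keyword value and the target filename are illustrative assumptions; the exact request should be generated with the *Show API request* icon on the CDS download form.

```python
# Illustrative request for the seasonal forecast monthly statistics dataset.
# All keyword values are placeholders; copy the real ones from the
# "Show API request" icon on the CDS download form.
request = {
    "originating_centre": "ecmwf",
    "variable": "total_precipitation",
    "product_type": "monthly_mean",
    "year": [str(y) for y in range(1993, 2017)],  # hindcast period (assumed)
    "month": "03",
    "leadtime_month": ["1", "2", "3"],
    "data_format": "grib",
}

def download_forecast(target="seasonal_monthly_tp.grib"):
    # Deferred import so the request dict can be inspected without cdsapi
    # installed; credentials are read from ~/.cdsapirc (or the environment
    # configured earlier by get_credentials).
    import cdsapi
    client = cdsapi.Client()
    client.retrieve("seasonal-monthly-single-levels", request, target)
```

Remember that the corresponding terms and conditions must already have been accepted on the CDS download page, or the request will be rejected.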
diff --git a/tutorials/04_seasonal_forecast/04x02_sf-verification.ipynb b/tutorials/04_seasonal_forecast/04x02_sf-verification.ipynb
index 032b7b2..c065efe 100644
--- a/tutorials/04_seasonal_forecast/04x02_sf-verification.ipynb
+++ b/tutorials/04_seasonal_forecast/04x02_sf-verification.ipynb
@@ -52,7 +52,7 @@
"id": "e82da11c",
"metadata": {},
"source": [
- "Please see here the full documentation of the [C3S Seasonal Forecast Datasets](https://confluence.ecmwf.int/display/CKB/C3S+Seasonal+Forecasts%3A+datasets+documentation). This notebook will use data from the CDS dataset [seasonal forecast monthly statistics on single levels](https://cds-beta.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=overview) (as opposed to multiple levels in the atmosphere)."
+ "Please see the full documentation of the [C3S Seasonal Forecast Datasets](https://confluence.ecmwf.int/display/CKB/C3S+Seasonal+Forecasts%3A+datasets+documentation). This notebook will use data from the CDS dataset [seasonal forecast monthly statistics on single levels](https://cds.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=overview) (as opposed to multiple levels in the atmosphere)."
]
},
{
@@ -213,7 +213,7 @@
"id": "8cb1220b",
"metadata": {},
"source": [
- "The first step is to request data from the Climate Data Store programmatically with the help of the CDS API. Let us make use of the option to manually set the CDS API credentials. First, you have to define two variables: `CDSAPI_URL` and `CDSAPI_KEY` which build together your CDS API key. Below, you have to replace the `#########` with your personal CDS key. Please find [here](https://cds-beta.climate.copernicus.eu/how-to-api) your personal CDS key."
+ "The first step is to request data from the Climate Data Store programmatically with the help of the CDS API. Let us make use of the option to manually set the CDS API credentials. First, you have to define two variables: `CDSAPI_URL` and `CDSAPI_KEY`, which together make up your CDS API key. Below, you have to replace the `#########` with your personal CDS key, which you can find [here](https://cds.climate.copernicus.eu/how-to-api)."
]
},
{
@@ -225,7 +225,7 @@
},
"outputs": [],
"source": [
- "CDSAPI_URL = 'https://cds-beta.climate.copernicus.eu/api'\n",
+ "CDSAPI_URL = 'https://cds.climate.copernicus.eu/api'\n",
"CDSAPI_KEY = '########################################'\n",
"\n",
"c = cdsapi.Client(url=CDSAPI_URL, key=CDSAPI_KEY)"
@@ -274,7 +274,7 @@
"id": "a657d6c8",
"metadata": {},
"source": [
- "If you have not already done so, you will need to accept the **terms & conditions** of the data before you can download it. These can be viewed and accepted in the [CDS download page](https://cds-beta.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=download) by scrolling to the end of the download form."
+ "If you have not already done so, you will need to accept the **terms & conditions** of the data before you can download it. These can be viewed and accepted in the [CDS download page](https://cds.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=download) by scrolling to the end of the download form."
]
},
{
@@ -284,7 +284,7 @@
"source": [
"\n",
"<b>NOTE:</b>\n",
- " An API request can be generated automatically from the <a href=\"https://cds-beta.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=download\">CDS download page</a>. At the end of the download form there is a <i>Show API request</i> icon, which allows to copy-paste a snippet of code equivalent to the one used below."
+ " An API request can be generated automatically from the <a href=\"https://cds.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=download\">CDS download page</a>. At the end of the download form there is a <i>Show API request</i> icon, which lets you copy and paste a snippet of code equivalent to the one used below."
]
},
{
@@ -345,7 +345,7 @@
"tags": []
},
"source": [
- "Now we will request from the CDS the observation data that will be used as the ground truth against which the hindcast data will be compared. In this notebook we will be using as observational reference the CDS dataset `reanalysis-era5-single-levels-monthly-means` which contains [ERA5](https://cds-beta.climate.copernicus.eu/datasets/reanalysis-era5-single-levels-monthly-means?tab=overview) monthly averaged data from the CDS. In order to compare it with the hindcast data we have just downloaded above, we will ask the CDS to regrid it to the same grid used by the C3S seasonal forecasts.\n",
+ "Now we will request from the CDS the observation data that will be used as the ground truth against which the hindcast data will be compared. As the observational reference we will use the CDS dataset `reanalysis-era5-single-levels-monthly-means`, which contains [ERA5](https://cds.climate.copernicus.eu/datasets/reanalysis-era5-single-levels-monthly-means?tab=overview) monthly averaged data. To compare it with the hindcast data downloaded above, we will ask the CDS to regrid it to the same grid used by the C3S seasonal forecasts.\n",
"\n",
"Running the code block below will download the ERA5 data from the CDS as specified by the following API keywords:\n",
"\n",
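The server-side regridding mentioned above can be expressed with the `grid` keyword of the CDS API. The sketch below is a minimal assumption-laden example: the 1x1 degree target grid, variable, and years are illustrative, not the notebook's exact request.

```python
# Illustrative ERA5 monthly-means request. The 'grid' keyword asks the CDS
# to interpolate onto a regular 1x1 degree grid (assumed here to match the
# C3S seasonal forecast grid); all other values are placeholders.
era5_request = {
    "product_type": "monthly_averaged_reanalysis",
    "variable": "2m_temperature",
    "year": [str(y) for y in range(1993, 2017)],  # same period as the hindcast
    "month": "05",
    "time": "00:00",
    "grid": "1.0/1.0",   # server-side regridding to lon/lat resolution
    "data_format": "netcdf",
}

def download_reference(target="era5_t2m.nc"):
    import cdsapi  # deferred; credentials come from ~/.cdsapirc
    cdsapi.Client().retrieve(
        "reanalysis-era5-single-levels-monthly-means", era5_request, target
    )
```

Regridding on the server keeps the download small and spares a local interpolation step before the comparison.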
diff --git a/tutorials/04_seasonal_forecast/04x02_sf-verification_NOT_WORKING.ipynb b/tutorials/04_seasonal_forecast/04x02_sf-verification_NOT_WORKING.ipynb
index 450a792..ecccd6d 100644
--- a/tutorials/04_seasonal_forecast/04x02_sf-verification_NOT_WORKING.ipynb
+++ b/tutorials/04_seasonal_forecast/04x02_sf-verification_NOT_WORKING.ipynb
@@ -52,7 +52,7 @@
"id": "587b347d",
"metadata": {},
"source": [
- "Please see here the full documentation of the [C3S Seasonal Forecast Datasets](https://confluence.ecmwf.int/display/CKB/C3S+Seasonal+Forecasts%3A+datasets+documentation). This notebook will use data from the CDS dataset [seasonal forecast monthly statistics on single levels](https://cds-beta.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=overview) (as opposed to multiple levels in the atmosphere)."
+ "Please see the full documentation of the [C3S Seasonal Forecast Datasets](https://confluence.ecmwf.int/display/CKB/C3S+Seasonal+Forecasts%3A+datasets+documentation). This notebook will use data from the CDS dataset [seasonal forecast monthly statistics on single levels](https://cds.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=overview) (as opposed to multiple levels in the atmosphere)."
]
},
{
@@ -231,7 +231,7 @@
"id": "eb7293de",
"metadata": {},
"source": [
- "The first step is to request data from the Climate Data Store programmatically with the help of the CDS API. Let us make use of the option to manually set the CDS API credentials. First, you have to define two variables: `CDSAPI_URL` and `CDSAPI_KEY` which build together your CDS API key. Below, you have to replace the `#########` with your personal CDS key. Please find [here](https://cds-beta.climate.copernicus.eu/how-to-api) your personal CDS key."
+ "The first step is to request data from the Climate Data Store programmatically with the help of the CDS API. Let us make use of the option to manually set the CDS API credentials. First, you have to define two variables: `CDSAPI_URL` and `CDSAPI_KEY`, which together make up your CDS API key. Below, you have to replace the `#########` with your personal CDS key, which you can find [here](https://cds.climate.copernicus.eu/how-to-api)."
]
},
{
@@ -243,7 +243,7 @@
},
"outputs": [],
"source": [
- "CDSAPI_URL = 'https://cds-beta.climate.copernicus.eu/api'\n",
+ "CDSAPI_URL = 'https://cds.climate.copernicus.eu/api'\n",
"CDSAPI_KEY = '########################################'\n",
"\n",
"c = cdsapi.Client(url=CDSAPI_URL, key=CDSAPI_KEY)"
@@ -292,7 +292,7 @@
"id": "0f4cc59e",
"metadata": {},
"source": [
- "If you have not already done so, you will need to accept the **terms & conditions** of the data before you can download it. These can be viewed and accepted in the [CDS download page](https://cds-beta.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=download) by scrolling to the end of the download form."
+ "If you have not already done so, you will need to accept the **terms & conditions** of the data before you can download it. These can be viewed and accepted in the [CDS download page](https://cds.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=download) by scrolling to the end of the download form."
]
},
{
@@ -302,7 +302,7 @@
"source": [
"\n",
"<b>NOTE:</b>\n",
- " An API request can be generated automatically from the <a href=\"https://cds-beta.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=download\">CDS download page</a>. At the end of the download form there is a <i>Show API request</i> icon, which allows to copy-paste a snippet of code equivalent to the one used below."
+ " An API request can be generated automatically from the <a href=\"https://cds.climate.copernicus.eu/datasets/seasonal-monthly-single-levels?tab=download\">CDS download page</a>. At the end of the download form there is a <i>Show API request</i> icon, which lets you copy and paste a snippet of code equivalent to the one used below."
]
},
{
@@ -363,7 +363,7 @@
"tags": []
},
"source": [
- "Now we will request from the CDS the observation data that will be used as the ground truth against which the hindcast data will be compared. In this notebook we will be using as observational reference the CDS dataset `reanalysis-era5-single-levels-monthly-means` which contains [ERA5](https://cds-beta.climate.copernicus.eu/datasets/reanalysis-era5-single-levels-monthly-means?tab=overview) monthly averaged data from the CDS. In order to compare it with the hindcast data we have just downloaded above, we will ask the CDS to regrid it to the same grid used by the C3S seasonal forecasts.\n",
+ "Now we will request from the CDS the observation data that will be used as the ground truth against which the hindcast data will be compared. As the observational reference we will use the CDS dataset `reanalysis-era5-single-levels-monthly-means`, which contains [ERA5](https://cds.climate.copernicus.eu/datasets/reanalysis-era5-single-levels-monthly-means?tab=overview) monthly averaged data. To compare it with the hindcast data downloaded above, we will ask the CDS to regrid it to the same grid used by the C3S seasonal forecasts.\n",
"\n",
"Running the code block below will download the ERA5 data from the CDS as specified by the following API keywords:\n",
"\n",
diff --git a/tutorials/05_climate_index/05x01_ci-windchill.ipynb b/tutorials/05_climate_index/05x01_ci-windchill.ipynb
index cba4d26..a588882 100644
--- a/tutorials/05_climate_index/05x01_ci-windchill.ipynb
+++ b/tutorials/05_climate_index/05x01_ci-windchill.ipynb
@@ -42,8 +42,8 @@
"The library prompts us to enter our credentials, which are then securely saved in our workspace. **This request is only made the first time**; afterward, the `get_credentials` function will automatically retrieve the credentials from the environment or workspace, eliminating the need to re-enter them in the Jupyter notebook.\n",
"\n",
"To obtain your API credentials:\n",
- "1. Register or log in to the CDS at [https://cds-beta.climate.copernicus.eu](https://cds-beta.climate.copernicus.eu).\n",
- "2. Visit [https://cds-beta.climate.copernicus.eu/how-to-api](https://cds-beta.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
+ "1. Register or log in to the CDS at [https://cds.climate.copernicus.eu](https://cds.climate.copernicus.eu).\n",
+ "2. Visit [https://cds.climate.copernicus.eu/how-to-api](https://cds.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
"\n",
"The library will prompt you to enter:\n",
"- **URL**: The URL field is prefilled; simply press Enter to accept the default.\n",
@@ -177,9 +177,9 @@
"\n",
"~First, you have to define two variables: `URL` and `KEY` which build together your CDS API key.~\n",
"\n",
- "~The string of characters that make up your KEY include your personal User ID and CDS API key. To obtain these, first register or login to the CDS (https://cds-beta.climate.copernicus.eu), then visit https://cds-beta.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
+ "~The string of characters that makes up your KEY includes your personal User ID and CDS API key. To obtain these, first register or log in to the CDS (https://cds.climate.copernicus.eu), then visit https://cds.climate.copernicus.eu/how-to-api and copy the string of characters listed after \"key:\". Replace the `#########` below with this string.~\n",
"\n",
- "~URL = 'https://cds-beta.climate.copernicus.eu/api'~\n",
+ "~URL = 'https://cds.climate.copernicus.eu/api'~\n",
"\n",
"~KEY = 'xxx'~\n",
"\n",
@@ -227,17 +227,6 @@
"> **Note:** UERRA data are stored on tapes in MARS, the ECMWF Meteorological Archival and Retrieval System. Accessing data from tapes is generally slower than accessing data directly from disk. The data requests below may take some hours to complete. For a quicker response you can replace the UERRA data request with similar download parameters for lower resolution ERA5 global reanalysis datasets."
]
},
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "# DOWNLOAD FILES\n",
- "fw = f'{DATADIR}UERRA_ws10m.nc'\n",
- "ft = f'{DATADIR}UERRA_t2m.nc'"
- ]
- },
{
"cell_type": "code",
"execution_count": null,
@@ -267,7 +256,7 @@
" 'time': '06:00',\n",
" 'data_format': 'netcdf_legacy',\n",
" },\n",
- " fw)"
+ " f'{DATADIR}UERRA_ws10m.nc')"
]
},
{
@@ -299,7 +288,7 @@
" 'time': '06:00',\n",
" 'data_format': 'netcdf_legacy',\n",
" },\n",
- " ft)"
+ " f'{DATADIR}UERRA_t2m.nc')"
]
},
{
@@ -317,6 +306,9 @@
"metadata": {},
"outputs": [],
"source": [
+ "fw = f'{DATADIR}UERRA_ws10m.nc'\n",
+ "ft = f'{DATADIR}UERRA_t2m.nc'\n",
+ "\n",
"# Create Xarray Dataset\n",
"dw = xr.open_dataset(fw)\n",
"dt = xr.open_dataset(ft)"
@@ -540,7 +532,7 @@
],
"metadata": {
"kernelspec": {
- "display_name": "Python 3 (ipykernel)",
+ "display_name": "Python 3",
"language": "python",
"name": "python3"
},
@@ -554,7 +546,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.10.9"
+ "version": "3.8.5"
}
},
"nbformat": 4,
diff --git a/tutorials/06_bias_correction/6x00 Download and Preprocess.ipynb b/tutorials/06_bias_correction/6x00 Download and Preprocess.ipynb
index 7950adc..a616d24 100644
--- a/tutorials/06_bias_correction/6x00 Download and Preprocess.ipynb
+++ b/tutorials/06_bias_correction/6x00 Download and Preprocess.ipynb
@@ -40,8 +40,8 @@
"The library prompts us to enter our credentials, which are then securely saved in our workspace. **This request is only made the first time**; afterward, the `get_credentials` function will automatically retrieve the credentials from the environment or workspace, eliminating the need to re-enter them in the Jupyter notebook.\n",
"\n",
"To obtain your API credentials:\n",
- "1. Register or log in to the CDS at [https://cds.climate.copernicus.eu](https://cds-beta.climate.copernicus.eu).\n",
- "2. Visit [https://cds.climate.copernicus.eu/how-to-api](https://cds-beta.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
+ "1. Register or log in to the CDS at [https://cds.climate.copernicus.eu](https://cds.climate.copernicus.eu).\n",
+ "2. Visit [https://cds.climate.copernicus.eu/how-to-api](https://cds.climate.copernicus.eu/how-to-api) and copy the API key provided.\n",
"\n",
"The library will prompt you to enter:\n",
"- **URL**: The URL field is prefilled; simply press Enter to accept the default.\n",