d4science_copernicus_notebooks
d4science_copernicus_notebooks is the repository of the official tutorial notebooks designed to help users work with the Copernicus Climate Data Store (CDS), adapted to run within the D4Science infrastructure.
The notebooks have been updated to work with the new Copernicus data format, integrated with the d4science_copernicus_cds library, and adapted to the D4Science environment.
For more information, visit the Copernicus Training C3S.
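Under the hood the tutorials rely on the standard cdsapi client. As an illustration only, a minimal retrieval is sketched below; the dataset name and request keys are examples copied from the CDS download form (not a prescription from these notebooks), and credentials must already be configured, e.g. with config_auth_cds.ipynb.

```python
# Minimal sketch of a CDS retrieval with the cdsapi client.
# Assumptions: credentials are already configured (e.g. via config_auth_cds.ipynb
# or ~/.cdsapirc); the dataset name and request keys are illustrative and should
# be copied from the CDS download form of the dataset you need.
import cdsapi

client = cdsapi.Client()

request = {
    "product_type": ["reanalysis"],
    "variable": ["2m_temperature"],
    "year": ["2023"],
    "month": ["01"],
    "day": ["01"],
    "time": ["12:00"],
    "data_format": "netcdf",
}

# Submit the request and download the result to a local NetCDF file.
client.retrieve("reanalysis-era5-single-levels", request, "era5_t2m_sample.nc")
```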
Version
v.1.0.0
Installation
Clone the repository locally and copy the notebooks into your JupyterLab instance.
Built With
- Copernicus CDSAPI - the Copernicus Climate Data Store (CDS) Application Program Interface (API) client
- python
- d4science
- d4science_copernicus_cds
Documentation
The original tutorial notebooks are available on Copernicus Training C3S.
These notebooks have been fixed (the official ones do not work with the new Copernicus data format), integrated with the d4science_copernicus_cds library, and adapted to the D4Science infrastructure.
Testing in D4Science JupyterLab
To test the notebooks in the D4Science JupyterLab environment, follow these steps:
- Access D4Science JupyterLab
  - Log in to the D4Science portal with your credentials.
  - Navigate to the JupyterLab section.
  - If available, select the Copernicus-specific VM so that the dependencies are pre-installed.
- Upload Notebooks
  - Upload the tutorial notebooks to your JupyterLab workspace.
- Install Required Libraries
  - If you are not using the Copernicus-specific VM, open a terminal within JupyterLab.
  - Install the required libraries by running:
    pip install -r requirements_tutorials.txt
- Configure CDS API Key
  - Open and run the config_auth_cds.ipynb notebook.
  - Follow the instructions to configure your CDS API key (a manual alternative is sketched after these steps).
- Run the Notebooks
  - Open the tutorial notebooks in JupyterLab.
  - Execute the cells to run the tutorials.
Alternatively, you can execute the following command inside a notebook to install the required libraries:
!pip install -r requirements_tutorials.txt
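If you prefer to configure the credentials manually instead of (or in addition to) running config_auth_cds.ipynb, the standard cdsapi mechanism is a ~/.cdsapirc file. A minimal sketch, assuming the current CDS endpoint and using placeholder values that you should replace with the ones shown on your CDS user profile:

```python
# Minimal sketch of a manual cdsapi credentials setup (assumption: the standard
# ~/.cdsapirc mechanism; the URL and key below are placeholders, replace them
# with the values shown on your CDS user profile page).
from pathlib import Path

cdsapirc = Path.home() / ".cdsapirc"
cdsapirc.write_text(
    "url: https://cds.climate.copernicus.eu/api\n"
    "key: <YOUR-PERSONAL-ACCESS-TOKEN>\n"
)
print(f"CDS credentials written to {cdsapirc}")
```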
Testing Locally on Visual Studio Code
To test the notebooks locally on Visual Studio Code, follow these steps:
- Install Visual Studio Code
  - Download and install Visual Studio Code from here.
- Install Necessary Extensions for Notebooks
  - Open Visual Studio Code.
  - Go to the Extensions view by clicking on the Extensions icon in the Activity Bar on the side of the window or by pressing Ctrl+Shift+X.
  - Search for and install the following extensions:
    - Python
    - Jupyter
- Create and Activate a Virtual Environment
  - Open a terminal in Visual Studio Code by selecting Terminal > New Terminal from the top menu.
  - Create a virtual environment by running:
    python -m venv venv
  - Activate the virtual environment:
    - On Windows:
      .\venv\Scripts\activate
    - On macOS and Linux:
      source venv/bin/activate
- Install Requirements
  - Install the required packages for the tutorials by running:
    pip install -r requirements_tutorials.txt
- Register on Copernicus Climate Data Store
  - Go to Copernicus Climate Data Store.
  - Register for an account and create an API key.
- Open and Run config_auth_cds.ipynb
  - In Visual Studio Code, open and execute the config_auth_cds.ipynb notebook.
  - Follow the instructions in the notebook to configure your CDS API key.
- Run the Tutorial Notebooks
  - Open the tutorial notebooks in Visual Studio Code.
  - Run the cells in each notebook to execute the tutorials.
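Before running the heavier tutorials, it can help to check that the credentials are picked up. A quick sketch, assuming the credentials were configured in the previous step:

```python
# Quick sanity check: creating the client raises an error if no valid
# credentials are found (e.g. in ~/.cdsapirc).
import cdsapi

client = cdsapi.Client()
print("CDS API client initialised for", client.url)
```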
Change log
See CHANGELOG.md
Authors
- Alfredo Oliviero (ORCID) - ISTI-CNR Infrascience Group
Maintainers
- Alfredo Oliviero (ORCID) - ISTI-CNR Infrascience Group
License
This project is licensed under the EUPL V.1.1 License - see the LICENSE.md file for details.
About the gCube Framework
This software is part of the gCube Framework: an open-source software toolkit used for building and operating Hybrid Data Infrastructures that enable the dynamic deployment of Virtual Research Environments by favouring the realisation of reuse-oriented policies.
The projects leading to this software have received funding from a series of European Union programmes; see FUNDING.md.