JYP's Python distribution (2024-03)
If you have questions about Python, read:
Using JYP's distribution
If you only need to use this distribution, just use the following steps and then read the What should I do now? section below.
There are more details on this page if you want to install a similar distribution yourself
- Initialize conda with:
Server | tcsh | bash |
---|---|---|
LSCE | source ~jypeter/.conda3_jyp_2024-03.csh | source ~jypeter/.conda3_jyp_2024-03.sh |
spirit[x] | source ~jypmce/.conda3_jyp.csh | source ~jypmce/.conda3_jyp.sh |
- Choose one of the installed environments and activate it with:
conda activate env_name
- python 2.7.x: Deprecated! If you still need to use python 2.7, please use CDAT 8.1
- python 3.x:
conda activate cdatm_py3
- Type which python (or the wp alias) and make sure you get something looking like […]/miniconda3<possibly_some_version>/envs/<some_env_name>/bin/python
- e.g. the cdatm_py3 environment at LSCE:
/home/share/unix_files/cdat/miniconda3_21-02/envs/cdatm_py3/bin/python
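A quick way to double-check from inside Python which interpreter and conda environment are active (conda activate sets the CONDA_DEFAULT_ENV variable); this is just a sketch of the check, not part of the distribution:

```python
# Print which Python interpreter is running and which conda environment is active
import os
import sys

print("interpreter:", sys.executable)
print("conda env  :", os.environ.get("CONDA_DEFAULT_ENV", "<none / not a conda shell>"))
```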
What should I do now?
You just want to use the installed distribution
You are ready if you have typed the conda activate
command specified in the previous section!
- You may want to add the source line to your shell configuration file so that you don't have to remember it and type it each time you open a new terminal
- Note: it's better to type the activate command in a terminal only when you want to use this specific version of python, rather than having the activate in your shell configuration file. This may have some side effects in some cases
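If you do add the source line to your configuration file, a middle ground is to wrap it in a one-word alias so nothing is activated automatically; the alias name below is hypothetical, and the sourced file is the LSCE one from the table above:

```shell
# ~/.bashrc fragment (tcsh users would put an equivalent alias in ~/.tcshrc)
# Define a one-word alias that initializes conda and activates the environment
# on demand, instead of activating in every new shell
alias pyj='source ~jypeter/.conda3_jyp_2024-03.sh && conda activate cdatm_py3'
```

Then typing pyj in a terminal gives you the distribution only when you need it.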
- Have a look at the Extra package list at the bottom of this page to see what is available in this distribution
- If you have not done it yet, read (or at least have a look at the list of sections):
You want to replicate this distribution on your own server or a desktop/laptop
Read the next sections of this web page
There are some slight differences depending on how and where you want to install this distribution:
- Only for you
This is the common case. Well, if you break something you can always remove everything and start from scratch
- For sharing a stable version with several other people
You should be careful not to break a shared installation!
- Be extremely careful with packages that require downgrading other packages/dependencies (especially key packages like numpy and matplotlib)
- Try to avoid packages only available through pip
The notes at the beginning of each section will tell you if you can skip the section when you are just installing for yourself
What's New in JYP's 2024-03 distribution?
- This is the first distribution not tied to CDAT packages
- There are lots (and lots) of extra packages that all have their own history. You can always visit the Extra package list on this page, to get news from individual packages
Installation with Miniconda3
Installing Miniconda3
If Miniconda3 is not already installed, or if you want to start over with a brand new (or clean) version of Miniconda3, read JYP steps for installing Miniconda3.
Installing CDAT 8.2.1
Prerequisites
We first check that we indeed have access to a (mini)conda installation (the base environment we get if we have correctly installed miniconda3), and we assume that we have write access to the conda disk hierarchy, and a few Gb of disk space available
> conda env list | grep base
base  *  /home/share/unix_files/cdat/miniconda3_21-02
> df -h /home/share/
Filesystem      Size  Used Avail Use% Mounted on
prolix3:/share  917G  245G  626G  29% /home/share
> du -sh /home/share/unix_files/cdat/miniconda3_21-02
577M    /home/share/unix_files/cdat/miniconda3_21-02
We can also use conda env list to check the existing environments, and remove some obsolete ones with conda remove -n obsolete_environment --all, in order to free up some space
Python 2.7 version
Python 2.x is deprecated in CDAT, starting with CDAT 8.2.1.
If you still need to use python 2.7, please use CDAT 8.1
Python 3.8 version
It's possible that, when using the tcsh shell and trying to use the conda create -n cdat-8.2.1_py3 lots_of_parameters_with_wildcards
line, you will get the following error message:
conda: No match
Apparently the tcsh version of conda does not like wildcards too much (CSH install/search fail with '*' variable in an argument)…
In that case, just type bash
and run the installation command in a bash shell
The installation line below is based on information from the CDAT wiki page. See also Some questions and notes about 8.2.1 installation
bash-4.2$ conda config --env --add channels conda-forge
bash-4.2$ conda config --set channel_priority strict
bash-4.2$ cat .condarc
channels:
  - conda-forge
  - defaults
$ conda config --describe channel_priority
$ conda create -n cdat-8.2.1_nompi_py3 -c conda-forge -c cdat/label/v8.2.1 cdat "libnetcdf=*=nompi_*" "mesalib=18.3.1" "python=3"
$ conda create -n cdat-8.2.1_py3 -c conda-forge -c cdat/label/v8.2.1 cdat "libnetcdf=*=mpi_openmpi_*" "mesalib=18.3.1" "python=3"
Resulting list of installed packages
conda list -n cdat-8.2.1_nompi_py3 > cdat_8.2.1_nompi_21-03-05.txt
- no mpi: cdat_8.2.1_nompi_21-03-05.txt
- with mpi: cdat_8.2.1_installed_packages.txt
Cloning the base CDAT environment before adding specific packages for LSCE
You can skip this section if you are installing CDAT just for one user (yourself) on a Linux machine (or a Windows 10 machine with WSL)
You will directly make changes and install packages in your main python environment
This section is about the creation of the cdatm19 environment
Notes about actually using the cdatm19 conda-based python
Notes:
- Why cloning? The initial CDAT environment is strictly the one created at PCMDI and certified by PCMDI. Rather than making changes directly in there, we keep it as it is, clone it, and make changes in the cloned environment
- Carefully working on different python environments (possibly cloned) is safer in a multi-user environment (i.e. you can have people use a specific environment, and make tests in another environment)
- Cloning a full environment uses Linux hard links and requires less disk space than making a real copy
$ conda create -n cdatm19_py3 --clone cdat-8.2.1_py3
$ cd /home/share/unix_files/cdat/miniconda3_21-02/envs
$ du -sh cdat-8.2.1_py3 cdatm19_py3
2.5G    cdat-8.2.1_py3
538M    cdatm19_py3
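The hard-link mechanism behind conda's cheap clones can be illustrated with plain ln on throwaway temp files (none of the paths below are the real conda hierarchy):

```shell
# Illustration of hard links: two names, one set of data blocks,
# which is why a cloned environment uses much less space than a real copy
work=$(mktemp -d)
echo "big package payload" > "$work/pkg.so"
ln "$work/pkg.so" "$work/pkg_clone.so"            # hard link, no data copied
stat -c '%i' "$work/pkg.so" "$work/pkg_clone.so"  # same inode number twice
rm -r "$work"
```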
Getting ready for a moving default CDAT environment
You can skip this section if you are installing CDAT just for one user (yourself) on a Linux machine (or a Windows 10 machine with WSL)
You will directly make changes and install packages in your main python environment
This step could probably be listed at the end, especially in a multi-user environment!
If there is already a cdatm_py3 link (pointing to an older environment), make sure that the new CDAT environment is stable and working correctly before updating the cdatm_py3 link
We create a cdatm_py3 symbolic link in the envs directory, that has a stable name but can be moved to point to the latest default (and hopefully stable) CDAT environment. In that case, most users can just activate this cdatm_py3 environment and always get the latest stable version.
$ cd /home/share/unix_files/cdat/miniconda3_21-02/envs
$ ln -s cdatm19_py3 cdatm_py3
$ conda env list
# conda environments:
#
base            *  /home/share/unix_files/cdat/miniconda3_21-02
cdat-8.2.1_py3     /home/share/unix_files/cdat/miniconda3_21-02/envs/cdat-8.2.1_py3
cdatm19_py3        /home/share/unix_files/cdat/miniconda3_21-02/envs/cdatm19_py3
cdatm_py3          /home/share/unix_files/cdat/miniconda3_21-02/envs/cdatm_py3
$ ls -l /home/share/unix_files/cdat/miniconda3_21-02/envs/
drwxr-xr-x [...] cdat-8.2.1_py3/
drwxr-xr-x [...] cdatm19_py3/
lrwxrwxrwx [...] cdatm_py3 -> cdatm19_py3/
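When a newer environment later becomes the stable one, ln -sfn repoints the stable name without removing the link first; a self-contained sketch using a temp directory and a hypothetical cdatm20_py3 environment:

```shell
# Sketch of the "moving stable link" pattern: cdatm_py3 -> latest stable env.
# Uses a throwaway temp directory instead of the real envs/ directory.
envs=$(mktemp -d)
mkdir "$envs/cdatm19_py3" "$envs/cdatm20_py3"  # cdatm20_py3: hypothetical newer env
ln -s cdatm19_py3 "$envs/cdatm_py3"            # initial stable link
ln -sfn cdatm20_py3 "$envs/cdatm_py3"          # later: repoint in one step
readlink "$envs/cdatm_py3"                     # -> cdatm20_py3
rm -r "$envs"
```

The -n flag matters: without it, ln -sf would create the new link inside the directory the old link points to.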
Customizing CDAT for LSCE
A quick test of cdms2 and vcs
You can use the quick test shown in To Test Your CDAT Installation, and check if you get the expected png file
You can also check the Testing vcs section of the LSCE installation of CDAT 8.1
Downloading cdms2/vcs test data
You should download the test data (174M of data…) and use it in the example scripts that you want to distribute, and scripts you write for reporting the errors you find (if any…). The downloaded data files will be available in: vcs.sample_data
$ conda activate cdatm19_py3
(cdatm19_py3) $ python -c 'import vcs; vcs.download_sample_data_files(); print("\nFinished downloading sample data to " + vcs.sample_data)'
[...]
Finished downloading sample data to /home/share/unix_files/cdat/miniconda3_21-02/envs/cdatm19_py3/share/cdat/sample_data
(cdatm19_py3) $ du -sh /home/share/unix_files/cdat/miniconda3_21-02/envs/cdatm19_py3/share/cdat/sample_data
174M    /home/share/unix_files/cdat/miniconda3_21-02/envs/cdatm19_py3/share/cdat/sample_data
$ python
>>> import os, vcs
>>> vcs.sample_data
'/home/jypeter/miniconda3_21-02/envs/cdat-8.2.1_py3/share/cdat/sample_data'
>>> os.listdir(vcs.sample_data)
['BlueMarble.ppm', 'clt.nc', 'geo.1deg.ctl', 'geo.1deg.gmp', 'geo.1deg.grb', ...]
Packages that have no dependency problems
After cloning, we are ready to install some extra (i.e. not directly related to CDAT) packages that may be useful to LSCE users
- We first install together as many packages as possible that don't require other channels than conda-forge, and that don't request a significant downgrade of what is already installed
- We use the dry-run option of conda in order to check the version (theoretically the latest) and origin (theoretically conda-forge) of the packages that would be installed. Or maybe the result will be that there are too many conflicts and the installation cannot proceed with the requested packages combination
- conda may take a very long time to report that there are too many conflicts, so we have started using mamba, which uses different solving libraries and is much (much!) faster than conda
- if the dry-run reports too many conflicts or requests too many downgrades, we try to remove some (not really important) packages and check if it works better
- the last resort is to create a new environment with a smaller set of packages if a user really needs a conflicting package
- We later install individual extra packages with conda install, mamba install or pip install
Pre-installation check with the dry-run option
/usr/bin/time mamba install -n cdatm19_nompi_py3 --dry-run -c conda-forge basemap basemap-data basemap-data-hires cartopy cmocean cmor ferret_datasets gdal geopandas glances gsw iris joblib netcdf4 palettable pandas pillow pyferret rpy2 scikit-image scikit-learn seaborn seawater shapely spyder statsmodels windspharm wrf-python xarray xlsxwriter
[... lots of information about what would be done, followed by some information about how long it took ...]
Summary:
  Install: 167 packages
  Downgrade: 3 packages
  Total download: 474 MB
DryRunExit: Dry run. Exiting.
8.31user 0.51system 0:11.45elapsed 77%CPU
Actual installation
/usr/bin/time mamba install -n cdatm19_nompi_py3 -c conda-forge basemap basemap-data basemap-data-hires cartopy cmocean cmor ferret_datasets gdal geopandas glances gsw iris joblib netcdf4 palettable pandas pillow pyferret rpy2 scikit-image scikit-learn seaborn seawater shapely spyder statsmodels windspharm wrf-python xarray xlsxwriter
Resulting list of installed packages
conda list -n cdatm19_nompi_py3 > cdatm19_nompi_py3_21-03-06.txt
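Since we keep conda list snapshots like the one above, a small stdlib-only script can report what a later installation step added, removed or changed between two snapshots; the line format assumed ("name version build [channel]" with # comment lines) is conda list's default text output:

```python
# Compare two "conda list" snapshot files and report added/removed/changed packages

def parse_snapshot(text):
    """Map package name -> version from conda list output."""
    pkgs = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        fields = line.split()
        if len(fields) >= 2:
            pkgs[fields[0]] = fields[1]
    return pkgs

def diff_snapshots(old_text, new_text):
    """Return (added, removed, changed) sorted package-name lists."""
    old, new = parse_snapshot(old_text), parse_snapshot(new_text)
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    changed = sorted(n for n in set(old) & set(new) if old[n] != new[n])
    return added, removed, changed

if __name__ == "__main__":
    before = "# packages in environment\nnumpy 1.19.2 py38_0\nmatplotlib 3.3.4 py38_0\n"
    after = "# packages in environment\nnumpy 1.19.2 py38_0\nmatplotlib 3.4.1 py38_0\nxarray 0.17.0 pyhd8ed1ab_0\n"
    print(diff_snapshots(before, after))  # (['xarray'], [], ['matplotlib'])
```

In practice you would pass the contents of two saved snapshot files (e.g. read with open().read()) instead of the inline strings.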
- no mpi: cdatm19_nompi_py3_21-03-06.txt
Extra packages installed with pip
- dreqPy: CMIP6 Data Request Python API
conda activate cdatm19_py3
pip install dreqPy
- Update with: pip install --upgrade dreqPy
- Get version number with:
$ drq -v
dreqPy version 01.00.29 [Version 01.00.29]
- ipython_ferretmagic: more details in the Extra packages list section
Packages with no dependency problems that were added (or updated) later
- CMOR: CMOR (Climate Model Output Rewriter) is used to produce CF-compliant netCDF files
conda install -n cdatm19_py3 -c conda-forge cmor
- cartopy: a Python package designed for geospatial data processing in order to produce maps and other geospatial data analyses
conda install -n cdatm19_py3 -c conda-forge cartopy
- joblib: running Python functions as pipeline jobs
conda install -n cdatm19_py3 -c conda-forge joblib
- CliMAF: a Climate Model Assessment Framework
TODO
Add here packages that would be useful and have not been installed yet, or have some problems that prevent their installation
- wrf-python: A collection of diagnostic and interpolation routines for use with output from the Weather Research and Forecasting (WRF-ARW) Model
conda install -n cdatm15 -c conda-forge wrf-python
- glances: a cross-platform monitoring tool (similar to top)
conda install -n cdatm15 -c conda-forge glances
Other packages
- NO such packages now!
Updating some packages
Some packages change more often than others, and can be easily updated the following way:
- dreqPy
- Update with: pip install --upgrade dreqPy
- Get version number with: drq -v
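For pip-installed packages, the installed version can also be queried from Python itself with the standard importlib.metadata module; the helper name below is ours, and dreqPy is just the example package:

```python
# Query an installed package's version without running its command-line tool
from importlib import metadata

def pkg_version(name):
    """Return the installed version string, or None if the package is absent."""
    try:
        return metadata.version(name)
    except metadata.PackageNotFoundError:
        return None

print(pkg_version("dreqPy"))  # e.g. '01.00.29' if installed, None otherwise
```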
Cleaning up things
Some packages may have files that can only be read by the person who installed CDAT and the LSCE extensions (e.g. pcmdi-metrics in 2.8.0 and cdp in 2.10)
We check if some of the installed files are missing read access for the group or others, and we manually change the permissions
> find /home/share/unix_files/cdat/miniconda3/envs \! -perm /g+r,o+r -ls
# Everything OK!
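The same find test can be extended to actually grant the missing permissions; a self-contained sketch on throwaway files (the real command would target the miniconda3 hierarchy shown above):

```shell
# Find files missing group/other read access, then grant it.
# Demonstrated on temporary files rather than the real conda tree.
files=$(mktemp -d)
touch "$files/readable" "$files/private"
chmod 644 "$files/readable"
chmod 600 "$files/private"
find "$files" -type f \! -perm /g+r,o+r -ls            # reports only 'private'
find "$files" -type f \! -perm /g+r,o+r -exec chmod go+r {} +
find "$files" -type f \! -perm /g+r,o+r                # now prints nothing
rm -r "$files"
```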
Extra packages list
- cxx-compiler: a metapackage to obtain a C++ compiler
- f90nml: a Fortran namelist parser, generator, and editor
- flake8: your Tool For Style Guide Enforcement
- Loguru: a library which aims to bring enjoyable logging in Python
- pot: Python Optimal Transport
This open source Python library provides several solvers for optimization problems related to Optimal Transport for signal, image processing and machine learning
- pybind11: a lightweight header-only library that exposes C++ types in Python and vice versa
- ruff: An extremely fast Python linter and code formatter, written in Rust
- xrft: Fourier transforms for xarray data
- zarr: a file storage format for chunked, compressed, N-dimensional arrays based on an open-source specification
Note: need to clean the list below, and move the packages to the ordered list further down
- netcdf4: a Python interface to the netCDF C library
- PCMDI metrics package (PMP): objectively compare results from climate models with observations using well-established statistical tests
- XlsxWriter: a Python module for creating Excel XLSX files
- Note: this is a dependency of dreqPy
- dreqPy: CMIP6 Data Request Python API
- CMOR: CMOR (Climate Model Output Rewriter) is used to produce CF-compliant netCDF files
- shapely: a Python wrapper for GEOS for algebraic manipulation of geometry (manipulation and analysis of geometric objects in the Cartesian plane)
- windspharm: spherical harmonic wind analysis in Python
- CliMAF: a Climate Model Assessment Framework
The extra packages below are (more or less) listed in package name alphabetical order. These are the extra packages that we explicitly installed, but there are many more packages installed (e.g. cdatm19_nompi_py3_21-03-06.txt)!
Initialize the environment and type conda list if you need an up-to-date list, including the packages' version and where they came from (conda-forge for most packages)
- AutoViz: the One-Line Automatic Data Visualization Library. Automatically Visualize any dataset, any size with a single line of code
- basemap: a library for plotting 2D data on maps in Python
- cartopy: a Python package designed for geospatial data processing in order to produce maps and other geospatial data analyses
- see also iris
- python-cdo: Python scripting interface of cdo
- python-cdo will install the cdo package (providing the cdo executable) as a dependency
- see also Using CDO from python or ruby
- cdsapi: The Climate Data Store (CDS) Application Program Interface (API) is a service providing programmatic access to CDS data
- CDS = Copernicus Climate Data Store
- cfgrib: Python interface to map GRIB files to the Unidata's Common Data Model v4 following the CF Conventions
- see also eccodes and pygrib
- cftime: Python library for decoding time units and variable values in a netCDF file conforming to the Climate and Forecasting (CF) netCDF conventions
- Used with xarray
- cf_xarray: provides an accessor (DataArray.cf or Dataset.cf) that allows you to interpret Climate and Forecast metadata convention attributes present on xarray objects
- Used with xarray
- clustergram: visualization and diagnostics for cluster analysis
- cmcrameri: Crameri's Scientific colour maps
[color]
- the colormaps are also available in palettable.scientific
- cmocean: beautiful colormaps for oceanography
[color]
- dash and jupyter-dash: the original low-code framework for rapidly building data apps in Python, R, Julia, and F#
- see also plotly
- D-Tale brings you an easy way to view & analyze Pandas data structures. It integrates seamlessly with ipython notebooks & python/ipython terminals.
- Install with:
conda install dtale -c conda-forge
- python-dtw: implementation of Dynamic Time Warping-type (DTW)
- Note: older installations provided dtw: DTW (Dynamic Time Warping) python module
- ecCodes: ecCodes is a package developed by ECMWF which provides an application programming interface and a set of tools for decoding and encoding GRIB and BUFR messages
- see also cfgrib and pygrib
- eigen: a C++ template library for linear algebra: matrices, vectors, numerical solvers, and related algorithms
- eofs: a Python package for EOF analysis of spatial-temporal data
- ESMValTool: Earth System Model Evaluation Tool
- should install esmvaltool-python in order to reduce the size/number of dependencies, after checking that it does not require downgrading important packages
- flox: fast & furious GroupBy reductions for dask.array
- See also
- pyferret and ferret_datasets: Ferret encapsulated in Python
- ipython_ferretmagic: IPython notebook extension for ferret
- Install with: pip install ferretmagic
- OSGeo/gdal: Geospatial Data Abstraction Library. GDAL is a translator library for raster and vector geospatial data formats
- GeoPy: a Python client for several popular geocoding web services
GeoPy makes it easy for Python developers to locate the coordinates of addresses, cities, countries, and landmarks across the globe using third-party geocoders and other data sources.
- gensim: a Python library for topic modelling, document indexing and similarity retrieval with large corpora
- See also scikit-learn
- gsw: Python implementation of the Thermodynamic Equation Of Seawater
- see also seawater
- gridded: a single API for accessing / working with gridded model results on multiple grid types
- icclim: icclim (Index Calculation for CLIMate) is a Python library to compute climate indices
- intake: a lightweight package for finding, investigating, loading and disseminating data
- intake-esm: data cataloging utility built on top of intake, pandas, and xarray
- ipyleaflet: interactive maps in the Jupyter notebook
- ipywidgets: ipywidgets, also known as jupyter-widgets or simply widgets, are interactive HTML widgets for Jupyter notebooks and the IPython kernel
- iris: a powerful, format-agnostic, community-driven Python package for analysing and visualising Earth science data
- see also cartopy
- install sample data with conda install -c conda-forge iris-sample-data
- joblib: running Python functions as pipeline jobs
- jupyterlab: the next-generation web-based user interface for Project Jupyter
- python-kaleido: a cross-platform library for generating static images (e.g. png, svg, pdf, etc.) for web-based visualization libraries
- MetPy: a collection of tools in Python for reading, visualizing, and performing calculations with weather data
- mpltern: a Python plotting library based on Matplotlib specifically designed for ternary plots
- nc-time-axis: a package that enables making plots in matplotlib with axes made up of cftime.datetime dates with any calendar type
- nccmp: compare two NetCDF files bitwise, semantically or with a user defined tolerance (absolute or relative percentage)
- can probably also be done with cdo -v diffn file1.nc file2.nc
- OpenCV: OpenCV (Open Source Computer Vision Library) is an open-source library that includes several hundreds of computer vision algorithms. See also OpenCV Python Tutorial and scikit-image
- Palettable: Color palettes for Python
[color]
- pandas: Python Data Analysis Library
- geopandas: an open source project to make working with geospatial data in python easier
- PeakUtils: utilities related to the detection of peaks on 1D data
- pillow: the friendly PIL (Python Imaging Library) fork
- plotly: Plotly's Python graphing library (sometimes referred to as plotly.py) makes interactive, publication-quality graphs
- see also python-kaleido and dash
- pooch: a friend to fetch your data files (makes it easy to download a file, without messing with requests and urllib)
- proplot: a lightweight matplotlib wrapper for making beautiful, publication-quality graphics
- Replaced by ultraplot
- psyplot: Interactive Data Visualization from Python and GUIs
- PyGMT: a Python interface for the Generic Mapping Tools
- pygrib: high-level interface to the ECWMF ECCODES C library for reading GRIB files
- see also eccodes and cfgrib
- pyleoclim: a Python package designed for the analysis of paleoclimate data
- Wait till it can be installed with conda (Why I'm not installing Pyleoclim yet)
- requests: an elegant and simple HTTP library for Python, built for human beings
- See also pooch
- rasterio: access to geospatial raster data
- rioxarray: rasterio xarray extension
- rpy2: an interface to R running embedded in a Python process
- scikit-image: a collection of algorithms for image processing in Python
- scikit-learn: an open source machine learning library that supports supervised and unsupervised learning. It also provides various tools for model fitting, data preprocessing, model selection and evaluation, and many other utilities.
- seaborn: statistical data visualization
- seawater: Python re-write of the CSIRO seawater toolbox
- see also gsw
- statsmodels: a Python module that provides classes and functions for the estimation of many different statistical models, as well as for conducting statistical tests, and statistical data exploration.
- streamlit: Streamlit turns data scripts into shareable web apps in minutes
- suntime: simple sunset and sunrise time calculation python library
- Warning: not available in conda, use pip install suntime
- Sweetviz: a pandas-based Python library that generates beautiful, high-density visualizations to kickstart EDA (Exploratory Data Analysis) with just two lines of code
- tensorflow-mkl: an end-to-end open source machine learning platform
- tensorflow-mkl will install the CPU-based (not GPU) version
- tqdm: make your loops show a smart progress meter
- UltraPlot: a succinct matplotlib wrapper for making beautiful, publication-quality graphics
- uxarray: provide xarray styled functionality for unstructured grid datasets following UGRID Conventions
- xarray: Xarray makes working with labelled multi-dimensional arrays in Python simple, efficient, and fun!
- See also: flox, xcdat, rioxarray, …
- xcdat: Xarray extended with Climate Data Analysis Tools
- xclim: an operational Python library for climate services, providing numerous climate-related indicator tools with an extensible framework for constructing custom climate indicators, statistical downscaling and bias adjustment of climate model simulations, as well as climate model ensemble analysis tools.
- xESMF: Universal Regridder for Geospatial Data
- xlrd: a library for reading data and formatting information from Excel files in the historical .xls format
- xoa: xarray-based ocean analysis library
xoa is the successor of vacumm (vacumm does not support Python3)
- xskillscore: metrics for verifying forecasts
- ydata-profiling: a leading package for data profiling, that automates and standardizes the generation of detailed reports, complete with statistics and visualizations.
Removed packages
- NO removed packages!
Packages NOT supported in Python3
Environments summary
After following the steps above, we get the following environments. Use the conda env list command (same result as conda info --envs) to get the up-to-date list of available environments.
Use conda list -n env_name to get a detailed list of which packages/modules are installed, or check the conda Installation history section to get more details
Environment name | Server | Summary |
---|---|---|
cdat-8.2.1_py3 | LSCE ciclad | CDAT 8.2.1 & Python 2.7 |
cdat-8.2.1_py3 | LSCE | CDAT 8.2.1 & Python 3.6 |
cdatm19_py3 or cdatm_py3 | LSCE ciclad | CDAT 8.2.1 & P 2.7 JYP version |
cdatm19_py3 or cdatm_py3 | LSCE | CDAT 8.2.1 & P 3.7 JYP version |