
Python distribution based on CDAT 8.2.1 notes

Using JYP version

If you simply want to use the python distribution based on CDAT 8.2.1 installed by JYP, just follow the steps below and then read the What should I do now? section.

  • Initialize conda with:
Server    tcsh                                     bash
LSCE      source ~jypeter/.conda3_21-02_jyp.csh    source ~jypeter/.conda3_21-02_jyp.sh
ciclad    source ~jypmce/.conda3_jyp.csh           source ~jypmce/.conda3_jyp.sh
  • Choose one of the installed environments and activate it with: conda activate env_name
    • python 2.7.x: Deprecated! If you still need to use python 2.7, please use CDAT 8.1
    • python 3.x: conda activate cdatm_py3
  • Type which python (or use the wp alias) and make sure you get a path looking like the one below (a quick Python-level check is also sketched after this list)
    • […]/miniconda3<possibly_some_version>/envs/<some_env_name>/bin/python
    • e.g. cdatm_py3 environment at LSCE:
      /home/share/unix_files/cdat/miniconda3_21-02/envs/cdatm_py3/bin/python
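
If you want to check the environment from inside Python as well, you can run a short script like the one below. This is only an illustrative sketch (not part of the official installation steps, and the file name check_env.py is just an example): it prints the interpreter path, which should point inside the environment you have just activated, and checks that the main CDAT modules can be imported.

# check_env.py - minimal sanity check of the activated environment (illustrative sketch)
import sys

# The interpreter path should look like
# [...]/miniconda3<possibly_some_version>/envs/<some_env_name>/bin/python
print("Python interpreter:", sys.executable)
print("Python version    :", sys.version.split()[0])

# The main CDAT modules should be importable from the activated environment
import cdms2
import vcs
print("cdms2 and vcs imported successfully")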

What should I do now?

You just want to use the installed version

You are ready if you have typed the conda activate command specified in the previous section!

  • You may want to add the source line to your shell configuration file, so that you don't have to remember it and type it each time you open a new terminal
    • Note: it is usually better to type the activate command in a terminal only when you want to use this specific version of python, rather than putting the activate command in your shell configuration file, because automatic activation can have unwanted side effects in some cases
  • Have a look at the Extra package list at the bottom of this page to see what is available in this distribution
  • If you have not done it yet, read (or at least have a look at the list of sections):

You want to replicate this installation on your own server or desktop/laptop

Read the next sections of this web page

There are some slight differences if you want to install this distribution:

  • just for yourself. This is the usual case, and if you break something you can always remove everything and start from scratch
  • for sharing a stable version with several other people

The notes at the beginning of each section will tell you whether you can skip that section when you are installing just for yourself

What's New in CDAT 8.2.1?

Note: this particular CDAT installation at LSCE (and on other machines/servers) provides many extra (non-CDAT) packages. You can jump directly to the Extra package list at the bottom of this page, if you want to see what is available

Installation with Miniconda3

Installing Miniconda3

If Miniconda3 is not already installed, or if you want to start over with a brand new version, read JYP steps for installing Miniconda3.

Installing CDAT 8.2.1

Prerequisites

We first check that we do have access to a (mini)conda installation (the base environment we get when miniconda3 is correctly installed), that we have write access to the conda directory hierarchy, and that a few GB of disk space are available

 > conda env list | grep base
base                  *  /home/share/unix_files/cdat/miniconda3_21-02

 > df -h  /home/share/
Filesystem      Size  Used Avail Use% Mounted on
prolix3:/share  917G  245G  626G  29% /home/share

 > du -sh /home/share/unix_files/cdat/miniconda3_21-02
577M    /home/share/unix_files/cdat/miniconda3_21-02

We can also use conda env list to list the existing environments, and remove obsolete ones with conda remove -n obsolete_environment --all, in order to free up some space

Python 2.7 version

Python 2.x is deprecated in CDAT, starting with CDAT 8.2.1.

If you still need to use python 2.7, please use CDAT 8.1

Python 3.8 version

It is possible that, when using the tcsh shell, the conda create -n cdat-8.2.1_py3 lots_of_parameters_with_wildcards command will fail with the following error message:

conda: No match

Apparently the tcsh version of conda does not like wildcards too much (CSH install/search fail with '*' variable in an argument)…

In that case, just type bash and run the installation command in a bash shell

The installation line below is based on information from the CDAT wiki page. See also Some questions and notes about 8.2.1 installation

bash-4.2$ conda config --env --add channels conda-forge
bash-4.2$ conda config --set channel_priority strict
bash-4.2$ cat .condarc
channels:
  - conda-forge
  - defaults
$ conda config --describe channel_priority

$ conda create -n cdat-8.2.1_nompi_py3 -c conda-forge -c cdat/label/v8.2.1 cdat "libnetcdf=*=nompi_*" "mesalib=18.3.1" "python=3"
$ conda create -n cdat-8.2.1_py3 -c conda-forge -c cdat/label/v8.2.1 cdat "libnetcdf=*=mpi_openmpi_*" "mesalib=18.3.1" "python=3"

Resulting list of installed packages

Some extra installation notes

Cloning the base CDAT environment before adding specific packages for LSCE

You can skip this section if you are installing CDAT just for one user (yourself) on a Linux machine (or a Windows 10 machine with WSL): in that case, you will directly make changes and install packages in your main python environment

This section is about the creation of the cdatm19 environment

Notes about actually using the cdatm19 conda-based python

Notes:

  • Why cloning? The initial CDAT environment is exactly the one created and certified by PCMDI. Rather than making changes directly in it, we keep it as it is, clone it, and make changes in the cloned environment
  • Carefully working on different python environments (possibly cloned) is safer in a multi-user environment (i.e. you can have people use a specific environment, and make tests in another environment)
  • Cloning a full environment uses Linux hard links and requires less disk space than making a real copy
$ conda create -n cdatm19_py3 --clone cdat-8.2.1_py3
$ cd /home/share/unix_files/cdat/miniconda3_21-02/envs
$ du -sh cdat-8.2.1_py3 cdatm19_py3
2.5G    cdat-8.2.1_py3
538M    cdatm19_py3

Getting ready for a moving default CDAT environment

You can skip this section if you are installing CDAT just for one user (yourself) on a Linux machine (or a Windows 10 machine with WSL): in that case, you will directly make changes and install packages in your main python environment

This step could probably be listed at the end, especially in a multi-user environment!

If there is already a cdatm_py3 link (pointing to an older environment), make sure that the new CDAT environment is stable and working correctly before updating the cdatm_py3 link

We create a cdatm_py3 symbolic link in the envs directory: it has a stable name, but can be re-pointed to the latest default (and hopefully stable) CDAT environment. This way, most users can just activate the cdatm_py3 environment and always get the latest stable version.

$ cd /home/share/unix_files/cdat/miniconda3_21-02/envs
$ ln -s cdatm19_py3 cdatm_py3

$ conda env list
# conda environments:
#
base                  *  /home/share/unix_files/cdat/miniconda3_21-02
cdat-8.2.1_py3           /home/share/unix_files/cdat/miniconda3_21-02/envs/cdat-8.2.1_py3
cdatm19_py3              /home/share/unix_files/cdat/miniconda3_21-02/envs/cdatm19_py3
cdatm_py3                /home/share/unix_files/cdat/miniconda3_21-02/envs/cdatm_py3

$ ls -l /home/share/unix_files/cdat/miniconda3_21-02/envs/
drwxr-xr-x [...] cdat-8.2.1_py3/
drwxr-xr-x [...] cdatm19_py3/
lrwxrwxrwx [...] cdatm_py3 -> cdatm19_py3/

Customizing CDAT for LSCE

A quick test of cdms2 and vcs

You can use the quick test shown in To Test Your CDAT Installation, and check if you get the expected png file

You can also check the Testing vcs section of the LSCE installation of CDAT 8.1
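
If you prefer a self-contained script, the sketch below is close to what the official quick test does. It is only an illustration: it assumes the sample data has already been downloaded (see the next section), and the output name test_clt.png is an arbitrary choice.

# Quick cdms2 + vcs test (illustrative sketch, close to the official quick test)
# Assumes the sample data has already been downloaded (see the next section)
import os
import cdms2
import vcs

# Open one of the sample files and read the 'clt' (total cloudiness) variable
f = cdms2.open(os.path.join(vcs.sample_data, "clt.nc"))
clt = f("clt")
f.close()

# Plot in background mode (no X display needed) and save the result as a png file
canvas = vcs.init(bg=True)
canvas.plot(clt)
canvas.png("test_clt")  # creates test_clt.png in the current directory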

Downloading cdms2/vcs test data

You should download the test data (174M of data…) and use it in the example scripts that you want to distribute, and in the scripts you write when reporting the errors you find (if any…). The downloaded data files will be available in the directory given by vcs.sample_data

$ conda activate cdatm19_py3

(cdatm19_py3) $ python -c 'import vcs; vcs.download_sample_data_files(); print("\nFinished downloading sample data to " + vcs.sample_data)'
[...]
Finished downloading sample data to /home/share/unix_files/cdat/miniconda3_21-02/envs/cdatm19_py3/share/cdat/sample_data

(cdatm19_py3) $ du -sh /home/share/unix_files/cdat/miniconda3_21-02/envs/cdatm19_py3/share/cdat/sample_data
174M    /home/share/unix_files/cdat/miniconda3_21-02/envs/cdatm19_py3/share/cdat/sample_data

$ python
>>> import os, vcs
>>> vcs.sample_data
'/home/jypeter/miniconda3_21-02/envs/cdat-8.2.1_py3/share/cdat/sample_data'

>>> os.listdir(vcs.sample_data)
['BlueMarble.ppm', 'clt.nc', 'geo.1deg.ctl', 'geo.1deg.gmp', 'geo.1deg.grb', ...]

Packages that have no dependency problems

After cloning, we are ready to install some extra packages (i.e. packages not directly related to CDAT) that may be useful to LSCE users

  • We first install together as many packages as possible that don't require channels other than conda-forge, and that don't request a significant downgrade of what is already installed
    • We use the dry-run option of conda in order to check the version (theoretically the latest) and origin (theoretically conda-forge) of the packages that would be installed; the result may also be that there are too many conflicts and the installation cannot proceed with the requested combination of packages
    • conda may take a very long time to report that there are too many conflicts, so we have started using mamba, which uses different solving libraries and is much (much!) faster than conda
    • if the dry-run reports too many conflicts or requests too many downgrades, we try to remove some (not really important) packages and check if things work better
      • the last resort is to create a new environment with a smaller set of packages, if a user really needs a conflicting package
  • We later install individual extra packages with conda install, mamba install or pip install

Pre-installation check with the dry-run option

/usr/bin/time mamba install -n cdatm19_nompi_py3 --dry-run -c conda-forge basemap basemap-data basemap-data-hires cartopy cmocean cmor ferret_datasets gdal gdal geopandas glances gsw iris joblib netcdf4 palettable pandas pillow pyferret rpy2 scikit-image scikit-learn seaborn seawater shapely spyder statsmodels windspharm wrf-python xarray xlsxwriter

[... lots of information about what would be done followed by some information about how long it took ...]

  Summary:

  Install: 167 packages
  Downgrade: 3 packages

  Total download: 474 MB

DryRunExit: Dry run. Exiting.

8.31user 0.51system 0:11.45elapsed 77%CPU

Actual installation

/usr/bin/time mamba install -n cdatm19_nompi_py3 -c conda-forge basemap basemap-data basemap-data-hires cartopy cmocean cmor ferret_datasets gdal gdal geopandas glances gsw iris joblib netcdf4 palettable pandas pillow pyferret rpy2 scikit-image scikit-learn seaborn seawater shapely spyder statsmodels windspharm wrf-python xarray xlsxwriter

Resulting list of installed packages

Extra packages installed with pip

  • dreqPy: CMIP6 Data Request Python API (a small usage sketch is given after this list)
    • conda activate cdatm19_py3
      pip install dreqPy
    • Update with: pip install --upgrade dreqPy
      • Get version number with:
        $ drq -v
        dreqPy version 01.00.29 [Version 01.00.29]
  • ipython_ferretmagic: IPython notebook extension for ferret
    • conda activate cdatm19_py3
      pip install ferretmagic
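
Besides the drq command-line script, dreqPy also provides a Python API. The sketch below is only a hint of how it can be used; the exact attribute names may change between data request versions, so check the dreqPy documentation.

# Illustrative dreqPy sketch: load the CMIP6 data request and look at its content
from dreqPy import dreq

dq = dreq.loadDreq()                  # load the data request shipped with dreqPy
print(dq.version)                     # data request version, e.g. 01.00.29
print(len(dq.coll['var'].items))      # number of variables defined in the request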

Packages with no dependency problems that were added (or updated) later

  • CMOR: CMOR (Climate Model Output Rewriter) is used to produce CF-compliant netCDF files
    • conda install -n cdatm19_py3 -c conda-forge cmor
    • Get version number with: python -c 'from cmor import *; print( (CMOR_VERSION_MAJOR, CMOR_VERSION_MINOR, CMOR_VERSION_PATCH) )'
  • cartopy: a Python package designed for geospatial data processing in order to produce maps and other geospatial data analyses
    • conda install -n cdatm19_py3 -c conda-forge cartopy
  • joblib: running Python functions as pipeline jobs
    • conda install -n cdatm19_py3 -c conda-forge joblib
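
As a very small illustration of what joblib does (this example is generic Python, not specific to CDAT), the sketch below runs a simple function on several inputs in parallel:

# Tiny joblib illustration: run a function on several inputs, using 2 worker processes
from math import sqrt
from joblib import Parallel, delayed

results = Parallel(n_jobs=2)(delayed(sqrt)(i) for i in range(10))
print(results)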

TODO

Add here packages that would be useful and have not been installed yet, or have some problems that prevent their installation

  • iris: A Python library for Meteorology and Climatology
    • conda install -n cdatxxx -c conda-forge iris
  • wrf-python: A collection of diagnostic and interpolation routines for use with output from the Weather Research and Forecasting (WRF-ARW) Model
    • conda install -n cdatm15 -c conda-forge wrf-python
  • glances: a cross-platform monitoring tool (similar to top)

Other packages

There is no warranty that the packages listed below will work correctly, because the compatibility checks had to be bypassed in order to install them…
  • NO such packages now!

Updating some packages

Some packages change more often than others, and can be easily updated the following way:

  • dreqPy
    • Update with: pip install --upgrade dreqPy
    • Get version number with: drq -v

Cleaning up things

Some packages may have files that can only be read by the person who installed CDAT and the LSCE extensions (e.g. pcmdi-metrics in 2.8.0 and cdp in 2.10)

We check if some of the installed files are missing read access for the group or other, and we manually change the permissions

 > find /home/share/unix_files/cdat/miniconda3/envs \! -perm /g+r,o+r -ls
# Everything OK!

Extra packages list

FIXME: need to clean the list below, and move the packages to the ordered list further down

  • netcdf4: a Python interface to the netCDF C library
  • ipython_ferretmagic: IPython notebook extension for ferret
  • PCMDI metrics package (PMP): objectively compare results from climate models with observations using well-established statistical tests
  • XlsxWriter: a Python module for creating Excel XLSX files
    • Note: this is a dependency of dreqPy
  • dreqPy: CMIP6 Data Request Python API
  • CMOR: CMOR (Climate Model Output Rewriter) is used to produce CF-compliant netCDF files
  • shapely: a Python wrapper for GEOS for algebraic manipulation of geometry (manipulation and analysis of geometric objects in the Cartesian plane)
  • windspharm: spherical harmonic wind analysis in Python
  • CliMAF: a Climate Model Assessment Framework

The extra packages below are (more or less) listed in package name alphabetical order. These are the extra packages that we explicitly installed, but there are many more packages installed (e.g. cdatm19_nompi_py3_21-03-06.txt)!

Initialize the environment and type conda list if you need an up-to-date list, including the packages' versions and where they came from (conda-forge for most packages)
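
If you would rather check a few versions from inside Python than read through the full conda list output, the following sketch uses the standard importlib.metadata module (available in Python 3.8 and later); the package names used here are just examples.

# Print the installed versions of a few of the extra packages (illustrative sketch)
from importlib.metadata import version, PackageNotFoundError

for pkg in ("xarray", "pandas", "cartopy", "seaborn"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed in this environment")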

  • basemap: a library for plotting 2D data on maps in Python
  • cartopy: a Python package designed for geospatial data processing in order to produce maps and other geospatial data analyses
    • see also iris
  • python-cdo: Python scripting interface of cdo
  • cfgrib: Python interface to map GRIB files to Unidata's Common Data Model v4 following the CF Conventions
    • see also eccodes and pygrib
  • clustergram: visualization and diagnostics for cluster analysis
  • cmcrameri: Crameri's Scientific colour maps [color]
  • cmocean: beautiful colormaps for oceanography [color]
  • python-dtw: an implementation of Dynamic Time Warping (DTW) algorithms
    • Note: older installations provided dtw: DTW (Dynamic Time Warping) python module
  • ecCodes: ecCodes is a package developed by ECMWF which provides an application programming interface and a set of tools for decoding and encoding GRIB and BUFR messages
    • need to install eccodes and python-eccodes (details)
    • see also cfgrib and pygrib
  • ESMValTool: Earth System Model Evaluation Tool
    • not installed yet (March 2022) because it required downloading 302 MB of extra packages and a downgrade of gdal
  • pyferret and ferret_datasets: Ferret encapsulated in Python
  • OSGeo/gdal: Geospatial Data Abstraction Library. GDAL is a translator library for raster and vector geospatial data formats
  • gsw: Python implementation of the Thermodynamic Equation Of Seawater
    • see also seawater
  • intake: a lightweight package for finding, investigating, loading and disseminating data
    • intake-esm: data cataloging utility built on top of intake, pandas, and xarray
  • iris: a powerful, format-agnostic, community-driven Python package for analysing and visualising Earth science data
    • see also cartopy
    • install sample data with conda install -c conda-forge iris-sample-data
  • joblib: running Python functions as pipeline jobs
  • nccmp: compare two NetCDF files bitwise, semantically or with a user defined tolerance (absolute or relative percentage)
    • can probably also be done with cdo diff
  • OpenCV: OpenCV (Open Source Computer Vision Library) is an open-source library that includes several hundreds of computer vision algorithms. See also OpenCV Python Tutorial and scikit-image
  • Palettable: Color palettes for Python [color]
  • pandas: Python Data Analysis Library
    • geopandas: an open source project to make working with geospatial data in python easier
  • PeakUtils: utilities related to the detection of peaks on 1D data
  • pillow: the friendly PIL (Python Imaging Library) fork
  • proplot: a lightweight matplotlib wrapper for making beautiful, publication-quality graphics
  • pygrib: high-level interface to the ECMWF ecCodes C library for reading GRIB files
    • see also eccodes and cfgrib
  • pyleoclim: a Python package designed for the analysis of paleoclimate data
  • rpy2: an interface to R running embedded in a Python process
  • scikit-image: a collection of algorithms for image processing in Python
  • scikit-learn: an open source machine learning library that supports supervised and unsupervised learning. It also provides various tools for model fitting, data preprocessing, model selection and evaluation, and many other utilities.
  • seaborn: statistical data visualization
  • seawater: Python re-write of the CSIRO seawater toolbox
    • see also gsw
  • statsmodels: a Python module that provides classes and functions for the estimation of many different statistical models, as well as for conducting statistical tests, and statistical data exploration.
  • suntime: simple sunset and sunrise time calculation python library
    • Warning: not available in conda, use pip install suntime
  • tensorflow-mkl: an end-to-end open source machine learning platform
  • xcdat: Xarray Extended with Climate Data Analysis Tools
  • xoa: xarray-based ocean analysis library
    • xoa is the successor of vacumm (vacumm does not support Python3)

Removed packages

  • NO removed packages!

Packages NOT supported in Python3

  • spanlib: Spectral Analysis Library
  • vacumm: Validation, Analysis, Comparison - Utilities written in Python to validate and analyze Multi-Model outputs, and compare them to observations
    • Check the xoa replacement package

Environments summary

After following the steps above, we get the following environments. Use the conda env list command (same result as conda info --envs) to get the up-to-date list of available environments.

Use conda list -n env_name to get a detailed list of which packages/modules are installed, or check the conda Installation history section to get more details

Environment name            Server          Summary
cdat-8.2.1_py3              LSCE, ciclad    CDAT 8.2.1 & Python 2.7
cdat-8.2.1_py3              LSCE            CDAT 8.2.1 & Python 3.6
cdatm19_py3 or cdatm_py3    LSCE, ciclad    CDAT 8.2.1 & Python 2.7, JYP version
cdatm19_py3 or cdatm_py3    LSCE            CDAT 8.2.1 & Python 3.7, JYP version




