From Notebook to Kay

It is possible to use Jupyter notebooks on Kay with JupyterHub. JupyterHub allows you to display the notebook in your local web browser whilst the Python code is executed on Kay. If you wish to use this option then go to this page.

This may be inconvenient when the execution time runs to hours or days. In this case you can convert the notebook to plain Python. This allows you to submit a job on Kay, log off, and come back later when it has finished.

There are four steps to achieve this. First, copy the notebook from your local machine over to Kay. Second, create the same Python environment you had on your local machine; we do this using conda environments. Third, convert the notebook into plain Python, which essentially comments out the non-code parts of the notebook. Finally, create a SLURM script that runs the Python code on Kay's compute nodes.

 

The steps needed are:

  1. Copy the notebook across to Kay, for example using scp (an example command is shown after this list).
  2. Create a conda environment with the packages that are needed, for example one containing the GPU version of TensorFlow (see the example after this list).
  3. Convert the notebook format to plain Python.
  4. Create a SLURM script that loads the Kay modules and the conda environment, and finally runs the Python script.
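
As a rough sketch of steps 1 and 2, the commands below copy a notebook over and create a conda environment. The notebook name, username and environment name (Test_Env, to match the SLURM script further down) are placeholders, and the hostname assumes the standard Kay login address; adjust them to your own setup.

# Step 1, on your local machine: copy the notebook to your home directory on Kay
scp my_notebook.ipynb myusername@kay.ichec.ie:~/

# Step 2, on Kay: create and activate a conda environment containing GPU TensorFlow
module load conda
conda create --name Test_Env tensorflow-gpu
source activate Test_Env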


Convert into Plain Python

The notebook format cannot be run directly on Kay. It is a convenient way to view Python code and to insert text between the code blocks for instructional purposes, so we need to convert the notebook into a plain Python script. To do this we need two additional Python packages, which can again be installed with pip; make sure you are still in the conda environment you created.

 

pip install ipython

pip install nbconvert

 

To convert from one to the other, use the command below. There should now be two files: the notebook (.ipynb) and the plain Python script (.py). Notice that the text cells have been converted into comments while the code cells remain unchanged.

 

ipython nbconvert --to script my_notebook.ipynb
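
Note that with more recent versions of nbconvert the equivalent command is jupyter nbconvert --to script my_notebook.ipynb; if the ipython form gives an error, try that instead.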


 

Create the SLURM Script

Below is an example script. We need to load the conda module and activate the conda environment inside the script (Test_Env in this case). Then it is simply a matter of executing the Python script. The Python version and packages will all come from the conda environment we created above.

If you need to use GPUs then you need to explicitly specify the GPU partition. You also need to define the location of the cuDNN library; setting LD_LIBRARY_PATH will do this.

The account name will also need to be changed to one that belongs to you. More information about SLURM is available here and here.


 

#!/bin/bash
#SBATCH --nodes=1
#SBATCH --time=05:20:00
#SBATCH --job-name=test
# Replace MyAccount with your own project account
#SBATCH -A MyAccount
# GPU partition; only needed if the job uses GPUs
#SBATCH -p GpuQ


# Start from a clean module environment
module purge
module load conda cuda/10.1.243 gcc
module list


# Activate the conda environment created earlier
source activate Test_Env


# Run from the directory the job was submitted from
cd $SLURM_SUBMIT_DIR

# Point the loader at the cuDNN library (needed for GPU TensorFlow)
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/ichec/packages/conda/2/pkgs/cudnn-7.6.0-cuda10.1_0/lib


# Run the converted notebook
python my_notebook.py


exit 0
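
Assuming the script above is saved as, say, run_notebook.sh (the filename is just an example), it can be submitted to the queue with sbatch and monitored with squeue:

sbatch run_notebook.sh
squeue -u $USER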

 

 

Things to Note

The compute nodes are not attached to the public network. This means that any command that downloads files from the web will not work. All files must be downloaded onto Kay beforehand, for example on a login node.
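
For example, if the notebook fetches a dataset over the web, download it on the login node before submitting the job (the URL and filename here are placeholders):

# Run on the login node, before submitting the job
wget https://example.com/my_dataset.tar.gz
tar -xzf my_dataset.tar.gz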

The compute nodes do not have any display software. Thus any plots that are displayed during the running of the Python script may cause it to crash. Comment out these parts of the code, or save the plots to a file instead.
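
As a minimal sketch of the second option, assuming the notebook uses matplotlib, switch to the non-interactive Agg backend and write the figure to a file instead of calling plt.show():

import matplotlib
matplotlib.use("Agg")  # non-interactive backend, no display required
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([0, 1, 2], [0, 1, 4])  # placeholder data
fig.savefig("my_plot.png")     # save to file instead of plt.show()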

 

 

 
