SRVX1
Teaching Hub and Access Point from the web
Getting started
Steps:
- Request access (this will be done for you by your supervisor).
- As Staff, access using SSH - How to SSH / VNC / VPN
- As Student, access using Teaching Hub - How to connect using the TeachingHub
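For staff, the access step boils down to an SSH login. A minimal sketch, using the server IP shown in the greeter below (replace `<username>` with your account name; see the SSH / VNC / VPN guide for details):

```bash
$ ssh <username>@131.130.157.11
```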
| Name | Value |
|---|---|
| Product | PowerEdge R940 |
| Processor | Intel(R) Xeon(R) Gold 6148 CPU @ 2.40GHz |
| Cores | 4 CPUs, 20 physical cores per CPU, 160 logical CPU units total |
| CPU time | 700 kh |
| Memory | 754 GB total |
| Memory/Core | 9.4 GB |
Mountain Greeter

```
 ----------------------------------------------
 131.130.157.11       _   .  ,  .  .
        * / \_ *    / \_      _  *     *  /\'_
       /    \  /   \,   ((      .  _/  /
      .  /\/\  /\/ :'  __  \_  `  _^/  ^/
     /  \/  \  _/  \-'\   *  /.'  ^_  \
    /\  .-  `. \/  \ /==~=-=~=-=-;.  _/ \ -
   / `-.__ ^  / .-'.--\ =-=~_=-=~=^/  _ `--./
  /SRVX1   `. /  /     `.~-^=-=~=^=.-'      '
 ----------------------------------------------
```
Jupyterhub
SRVX1 serves a teaching JupyterHub with JupyterLab, which allows easy access for students and teachers. Access: https://teachinghub.wolke.img.univie.ac.at
Signup is granted only by teachers and requires an srvx1 user account. A new password is needed, and a TOTP (time-based one-time password) will be created.
Download and use any TOTP-capable authenticator app.
Please note that the TOTP device needs a synchronized clock; otherwise the time-based passwords will be out of sync and authentication fails.
After registering, the teacher/admin has to grant you access before you can log in.
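To check whether your device's clock is synchronized before generating TOTP codes, a quick sketch for a systemd-based Linux system (other platforms will differ):

```bash
$ timedatectl | grep synchronized
System clock synchronized: yes
```

If this reports `no`, enable NTP synchronization before retrying the login.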
Software
The typical installation of an Intel server has the Intel compiler suites (`intel-parallel-studio`, `intel-oneapi`) and the open-source GNU compilers installed. Based on these two compiler families (`intel`, `gnu`), there are usually two versions of each scientific software package.
Major Libraries:
- OpenMPI (3.1.6, 4.0.5)
- HDF5
- NetCDF (C, Fortran)
- ECCODES from ECMWF
- Math libraries, e.g. intel-mkl, lapack, scalapack
- Interpreters: Python, Julia
- Tools: cdo, ncl, nco, ncview
These software libraries are usually handled by environment modules.
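A typical module workflow looks like this (the module name is taken from the installed-modules listing in this document; newer versions may be available):

```bash
$ module av                               # list all available modules
$ module load netcdf-c/4.7.4-gcc-8.5.0    # add a library to your environment
$ module list                             # show currently loaded modules
$ module unload netcdf-c/4.7.4-gcc-8.5.0  # remove it again
```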
Currently installed modules
Please note that new versions might already be installed.
```bash
$ module av
------- /home/swd/spack/share/spack/modules/linux-rhel8-skylake_avx512 -------
anaconda2/2019.10-gcc-8.5.0    proj/7.1.0-gcc-8.5.0
anaconda3/2020.11-gcc-8.5.0    proj/8.1.0-gcc-8.5.0
anaconda3/2021.05-gcc-8.5.0    python/3.8.12-gcc-8.5.0
autoconf/2.69-oneapi-2021.2.0
autoconf/2.71-oneapi-2021.2.0
cdo/1.9.10-gcc-8.5.0
eccodes/2.19.1-gcc-8.5.0
eccodes/2.19.1-gcc-8.5.0-MPI3.1.6
eccodes/2.19.1-intel-20.0.4
eccodes/2.19.1-intel-20.0.4-MPI3.1.6
eccodes/2.21.0-gcc-8.5.0
eccodes/2.21.0-gcc-8.5.0-MPI3.1.6
eccodes/2.21.0-intel-20.0.4
fftw/3.3.10-gcc-8.5.0
fftw/3.3.10-gcc-8.5.0-MPI3.1.6
gcc/8.5.0-gcc-8.5rhel8
geos/3.8.1-gcc-8.5.0
hdf5/1.10.7-gcc-8.5.0
hdf5/1.10.7-gcc-8.5.0-MPI3.1.6
hdf5/1.10.7-intel-20.0.4-MPI3.1.6
hdf5/1.12.0-intel-20.0.4
hdf5/1.12.0-oneapi-2021.2.0
intel-mkl/2020.4.304-intel-20.0.4
intel-oneapi-compilers/2021.2.0-oneapi-2021.2.0
intel-oneapi-dal/2021.2.0-oneapi-2021.2.0
intel-oneapi-mkl/2021.2.0-oneapi-2021.2.0
intel-oneapi-mpi/2021.2.0-oneapi-2021.2.0
intel-parallel-studio/composer.2020.4-intel-20.0.4
libemos/4.5.9-gcc-8.5.0-MPI3.1.6
libemos/4.5.9-intel-20.0.4
libgeotiff/1.6.0-intel-20.0.4
matlab/R2020b-gcc-8.5.0
miniconda2/4.7.12.1-gcc-8.5.0
miniconda3/4.10.3-gcc-8.5.0
nco/4.9.3-intel-20.0.4
nco/5.0.1-gcc-8.5.0
ncview/2.1.8-gcc-8.5.0
ncview/2.1.8-intel-20.0.4-MPI3.1.6
netcdf-c/4.6.3-gcc-8.5.0-MPI3.1.6
netcdf-c/4.6.3-intel-20.0.4-MPI3.1.6
netcdf-c/4.7.4-gcc-8.5.0
netcdf-c/4.7.4-intel-20.0.4
netcdf-fortran/4.5.2-gcc-8.5.0-MPI3.1.6
netcdf-fortran/4.5.2-intel-20.0.4-MPI3.1.6
netcdf-fortran/4.5.3-gcc-8.5.0
netlib-lapack/3.9.1-gcc-8.5.0
netlib-lapack/3.9.1-intel-20.0.4
netlib-lapack/3.9.1-oneapi-2021.2.0
netlib-scalapack/2.1.0-gcc-8.5.0
netlib-scalapack/2.1.0-gcc-8.5.0-MPI3.1.6
openblas/0.3.18-gcc-8.5.0
opencoarrays/2.7.1-gcc-8.5.0
opencoarrays/2.7.1-intel-20.0.4
openmpi/3.1.6-gcc-8.5.0
openmpi/3.1.6-intel-20.0.4
openmpi/4.0.5-gcc-8.5.0
openmpi/4.0.5-intel-20.0.4
parallel-netcdf/1.12.2-gcc-8.5.0
parallel-netcdf/1.12.2-gcc-8.5.0-MPI3.1.6
----------------------------- /home/swd/modules ------------------------------
anaconda3/leo-current-gcc-8.3.1    intelpython/2021.4.0.3353    pypy/7.3.5
ecaccess-webtoolkit/4.0.2          intelpython/2022.0.2.155     shpc/0.0.33
ecaccess-webtoolkit/6.3.1          micromamba/0.15.2            teleport/10.1.4
enstools/v2021.11                  micromamba/0.27.0            teleport/10.3.3
idl/8.2-sp1                        ncl/6.6.2                    xconv/1.94
```
For details on how to use environment modules, go to Using Environment Modules.
User services
There is a script collection accessible via the `userservices` command, e.g. running:
```bash
$ userservices
Usage: userservices [service] [Options]
Available Services:
------------------------------------------------------------------------
archive          --- Submit files/folders to ZID Archive
fetch-sysinfo    --- Display system information
filesender       --- Transfer files to ACONET filesender (requires account)
fix-permissions  --- fix file/directory permissions
home-dir-check   --- Check home directory/configuration
modules          --- Pretty print environment modules
transfersh       --- Transfer files/directories (IMGW subnet)
weather          --- Retrieve weather information
yopass           --- Send messages/small files to YoPass (encrypted)
------------------------------------------------------------------------
These scripts are intended to help with certain known problems.
Report problems to: michael.blaschek@univie.ac.at
```
These are scripts in a common directory; feel free to copy or edit them as you like. Note that some services, like `filesender`, require an ACONET account (accessible via your u:account).
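As a sketch, a service is invoked by passing its name (from the list above) plus any of that script's options to `userservices`:

```bash
$ userservices weather          # retrieve weather information
$ userservices fetch-sysinfo    # display system information
```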
Container Hub
Currently it is possible to run Singularity/Apptainer containers on all our servers. This is quite similar to Docker, but much more secure for multi-user servers. Almost every Docker container can be converted into a Singularity container, and some of the build recipes use Docker.
There are a number of prepared containers but more can be added. If you have a wish or an existing container useful for others please share.
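A minimal sketch of using a prepared container with Singularity/Apptainer, based on the RTTOV container path and its `lab` app from the list in this document (the `python --version` command is only illustrative):

```bash
$ singularity exec /home/swd/containers/rttov-jupyter/jup3rttov.sif python --version
$ singularity run --app lab /home/swd/containers/rttov-jupyter/jup3rttov.sif
```

`singularity run-help <container>.sif` shows a container's own usage notes, if its recipe provides any.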
```yaml
containers:
- root: /home/swd/containers
- available:
  - RTTOV:
    - RTTOV: 12.3
    - compiler: gcc:7.3.0 (anaconda)
    - path: /home/swd/containers/rttov-jupyter/jup3rttov.sif
    - os: centos:6.10
    - python: 3.7.4
    - singularity: 3.5.2
    - packages:
      - anaconda3
      - jupyter jupyterlab numpy matplotlib pandas xarray bottleneck dask numba scipy netcdf4 cartopy h5netcdf nc-time-axis cfgrib eccodes nodejs
    - apps:
      - atlas
      - lab
      - notebook
      - rtcoef
      - rthelp
      - rttest
    - description: Use for running RTTOV simulations with Python.
  - LIBEMOS:
    - LIBEMOS: 4.5.1
    - compiler: gcc:7.5.0
    - path: /home/swd/containers/libemos-dev/libemos-dev.sif
    - os: ubuntu:18.04
    - singularity: 3.8.6
    - packages:
      - openmpi:2.1.1-8
      - fftw3:3.3.7-1
      - eccodes:2.6.0
      - libemos
    - description: Use for building flexextract with working libemos dependencies.
  - TEXLIVE:
    - TEXLIVE: TeX Live 2022
    - compiler: gcc:12.2.0
    - path: /home/swd/containers/texlive/texlive-2022.sif
    - os: debian 12
    - singularity: 3.8.7
    - packages:
      - texlive from gitlab
```
Last update: February 29, 2024
Created: October 14, 2021