Benchmarking is a common task for evaluating the hardware and software environment of an HPC system during its procurement phase. HPC systems also evolve over time through library updates, new software packages, or the installation of new hardware components, and all of these changes can influence the performance of user applications. This creates the need for an automated benchmarking environment that enables continuous performance evaluation.
The JUBE benchmarking environment provides a flexible, lightweight, script-based framework to set up benchmark tasks on top of generic benchmark applications or full user applications. The environment controls the major aspects of benchmark execution, such as parameter variation, workflow execution, data handling, asynchronous execution to support HPC job submission, and benchmark result extraction.
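To illustrate the parameter variation and workflow aspects mentioned above, the following is a minimal sketch of a JUBE XML configuration. The element names (`benchmark`, `parameterset`, `step`) follow JUBE's XML input format, but the benchmark name, parameter values, and command are purely illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<jube>
  <!-- Illustrative benchmark definition; "hello_bench" and
       "bench_run" are placeholder names -->
  <benchmark name="hello_bench" outpath="bench_run">
    <!-- Parameter variation: JUBE expands the comma-separated
         list into one workpackage per value -->
    <parameterset name="task_set">
      <parameter name="tasks" type="int">1,2,4</parameter>
    </parameterset>
    <!-- Execution step: the command runs once for each
         parameter combination -->
    <step name="run">
      <use>task_set</use>
      <do>echo "running with $tasks tasks"</do>
    </step>
  </benchmark>
</jube>
```

In this sketch, JUBE would create three independent runs of the `run` step, one for each value of `tasks`; real setups would replace the `echo` command with a batch-system submission to exploit the asynchronous execution support.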
The talk will present the capabilities of the current generation of the JUBE environment. It will cover the general basics of how to port and configure a benchmark application to make it available within JUBE, and will discuss possible use cases such as fully automated, scheduled benchmark setups.