How to Run Inference via the CLI

Warning

The information below is incomplete and requires extra work. Use it at your own risk!

Note

  • On Apple Silicon (MPS), autocast is disabled by PyTorch; prefer --dtype float16.

Although accelerated hardware (GPUs) is recommended, CPU execution is also possible. It is assumed that enough RAM is available (>=32GB).

--device auto chooses CUDA → MPS → CPU. Autocast is used on CUDA; for MPS, prefer --dtype float16.

The CLI offers a unified interface whose commands can be combined, for example, in a single bash script (a combined script sketch is shown after the UPT and AB-UPT commands below). This is the recommended way to run inference with the provided models.

We provide a few examples of how to use YAML-based config files alongside the CLI. They can be found under /src/emmi_inference/examples/configs. These configs should work for the three model architectures (Transolver, UPT and AB-UPT). The checkpoints have to be acquired or produced independently.
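
For reference, the example configs used in this section can be listed directly; the three file names below are taken from the commands further down, and the directory may contain additional files:

ls /src/emmi_inference/examples/configs
# expected to include (among possibly others):
#   example_config_abupt.yaml
#   example_config_transolver.yaml
#   example_config_upt.yaml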

Two different datasets are used: ShapeNetCar for Transolver and UPT, and DrivAerML for AB-UPT. As a result, the data and collator parts of the configs differ between the two; please refer to the example files for more details.
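
A quick way to see how the data and collator parts differ is to diff the UPT (ShapeNetCar) and AB-UPT (DrivAerML) example configs:

diff \
    /src/emmi_inference/examples/configs/example_config_upt.yaml \
    /src/emmi_inference/examples/configs/example_config_abupt.yaml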

A working Transolver example is given below:

emmi-infer run \
    /src/emmi_inference/examples/configs/example_config_transolver.yaml \
    transolver \
    "/data/transolver/e8ek00ze/checkpoints/transolver cp=latest model.th" \
    --device cpu
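
On Apple Silicon, the same run might look like the sketch below. This assumes that mps is an accepted value for --device (the auto fallback order above suggests it is) and that --dtype is accepted by the run command, as in the note at the top of this section:

emmi-infer run \
    /src/emmi_inference/examples/configs/example_config_transolver.yaml \
    transolver \
    "/data/transolver/e8ek00ze/checkpoints/transolver cp=latest model.th" \
    --device mps \
    --dtype float16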

The list of model types is available within the ModelRegistry under /src/emmi_inference/models/registry.py.
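
To quickly check which type names are registered, one can grep the registry module; the pattern below only covers the three model types used in this section, and the exact registration syntax may differ:

grep -n -i -E 'transolver|upt|abupt' /src/emmi_inference/models/registry.py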

The following commands will run UPT and AB-UPT, respectively:

emmi-infer run \
    /src/emmi_inference/examples/configs/example_config_upt.yaml \
    upt \
    "/data/upt/to5u4s5i/checkpoints/upt cp=best_model.loss.test.total model.th" \
    --device cpu

emmi-infer run \
    /src/emmi_inference/examples/configs/example_config_abupt.yaml \
    abupt \
    /data/abupt/checkpoints/ab-upt-drivaerml-tutorial.th \
    --device cpu
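
As mentioned above, these calls can be combined into a single bash script. A minimal sketch that reuses the exact commands from this section and stops on the first error (the script name is only an example):

#!/usr/bin/env bash
# run_all_examples.sh -- combine the three example inference runs from this section
set -euo pipefail

CONFIG_DIR=/src/emmi_inference/examples/configs

# Transolver (ShapeNetCar)
emmi-infer run \
    "$CONFIG_DIR/example_config_transolver.yaml" \
    transolver \
    "/data/transolver/e8ek00ze/checkpoints/transolver cp=latest model.th" \
    --device cpu

# UPT (ShapeNetCar)
emmi-infer run \
    "$CONFIG_DIR/example_config_upt.yaml" \
    upt \
    "/data/upt/to5u4s5i/checkpoints/upt cp=best_model.loss.test.total model.th" \
    --device cpu

# AB-UPT (DrivAerML)
emmi-infer run \
    "$CONFIG_DIR/example_config_abupt.yaml" \
    abupt \
    /data/abupt/checkpoints/ab-upt-drivaerml-tutorial.th \
    --device cpu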

Note

Make sure to update the paths for your datasets in the config files!
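
One way to locate the entries that typically need changing is to search the example configs for path-like keys; the key pattern below is only a guess, and the files themselves are authoritative:

grep -n -i -E 'path|dir|root' /src/emmi_inference/examples/configs/*.yaml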