Simulation tools for thermal management of data centers help to improve the layout of new builds or to analyse thermal problems in existing data centers. The development of LBM on remote GPUs as an approach for such simulations is discussed, making use of VirtualGL and prioritised multi-threaded implementations of an existing LBM code. The simulation is configured to model an existing, highly monitored test data center. Steady-state root mean square averages of measured and simulated temperatures are compared, showing good agreement. The full capability of this simulation approach is demonstrated by comparing rack temperatures against a time-varying workload, which employs time-dependent boundary conditions.
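
The steady-state comparison described above can be sketched as follows. This is a minimal illustration, not the authors' code: the temperature values are hypothetical placeholders, and the real study compares RMS averages computed from sensor measurements in the monitored test data center against the LBM simulation output.

```python
import math

def rms(values):
    """Root mean square average of a sequence of temperatures (degrees C)."""
    return math.sqrt(sum(v * v for v in values) / len(values))

# Hypothetical steady-state readings (degrees C) at a set of rack sensors;
# in the paper these come from the instrumented test data center.
measured = [22.1, 23.4, 24.0, 22.8]
simulated = [22.3, 23.1, 24.4, 22.6]

# Compare the two RMS averages, as in the steady-state validation.
print(round(rms(measured), 3), round(rms(simulated), 3))
```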