Google’s MLPerf v1.1 Training submission showcased two large language models, at 480B and 200B parameters, trained on publicly available Cloud TPU v4 Pod slices.
