From what little I know about atomic clocks, the typical caesium and rubidium clocks work off the microwave lines from hyperfine transitions. Electronic transitions allow the use of optical wavelengths and correspondingly higher precision. For higher precision still, one needs to cool the atoms. All this is established art.
What I'd like to know more about is what specific claims are made for these clocks and how they differ from the chip-scale rubidium references NIST is publishing on. Specifically, what's the Allan deviation (root Allan variance)? Settling time? Time transfer methodology?
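For anyone unfamiliar with the metric: the Allan deviation is the square root of the Allan variance, computed from first differences of block-averaged fractional-frequency data. A minimal sketch of the overlapping estimator (synthetic white frequency noise, not real clock data):

```python
import numpy as np

def allan_deviation(y, m):
    """Overlapping Allan deviation of fractional-frequency samples y
    at an averaging time of m sample intervals."""
    # Running block averages of length m (overlapping estimator)
    ybar = np.convolve(y, np.ones(m) / m, mode="valid")
    d = ybar[m:] - ybar[:-m]          # first differences at lag m
    return np.sqrt(0.5 * np.mean(d ** 2))

# White frequency noise: the Allan deviation should fall as tau**-0.5
rng = np.random.default_rng(0)
y = rng.normal(0.0, 1e-12, 100_000)   # fractional frequency, dimensionless
print(allan_deviation(y, 1))          # ~1e-12
print(allan_deviation(y, 100))        # ~1e-13, i.e. improves as sqrt(m)
```

The slope of the deviation versus averaging time is what tells you which noise process dominates (white FM, flicker, random walk), which is why datasheets quote it at several taus rather than as one number.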
The real neat trick is time transfer from device to device and handling the bookkeeping appropriately as one transfers time from stationary devices to moving devices such as aircraft. At caesium stability, relativity becomes apparent even at the modest velocities and altitudes of an airliner.
Mine's the one with the 100 gram atomic clock in the pocket: https://www.orolia.com/products/atomic-clocks-oscillators