I think the risks should be prioritized by when they are expected to materialize. I see AI and pandemics as the top priorities: these developments move extremely fast, which makes them especially perilous. (Btw, those claiming climate change will cause more pandemics do not base their argument on the science.)