Red teaming is great for identifying one source of risk: malicious acts and the accident risks they foreshadow. But absent deliberate consideration of the harms that arise when everything works WELL and the technology proliferates, red teaming won't anticipate all likely risks and impacts.