Yes, avoiding extinction from AI *is* an urgent priority

  • The problem is that it's not binary.

    Is an AI keeping a few thousand humans around as zoo animals or lab specimens a better outcome?

    Extinction could also be accidental, intentional (serving human or AI goals), or simply the result of competition for resources.

    The biggest problem is that we already do all the things we don't want AI to do.

    War? Check.

    Letting humans die for our own financial and material gain? Check.

    Destroying things necessary for human survival, again for financial and material gain? Check.

    Making highly addictive substances and activities for other people, and leaving the consequences for others to deal with? Check.

  • I find the argument that they can't shut down because less conscientious companies would take the lead incredibly weak. What's stopping the bad actors now? Why not just stop subsidizing them and have only nationalized labs working on it?

  • Can you imagine a CEO of a company coming out and saying, "oh, and my product could wipe out humanity," as a way to benefit their company?