There's no sign of the "hard problem of consciousness" ever becoming tractable, so waiting is unlikely to provide any new information for decision-making.
Anthropic are the only ones publicly thinking seriously about this right now, and for that reason I think they are the most likely to achieve AGI in which agency, autonomy, alignment and ethics are all intertwined.