#The Magic of AI Painting# How to restore a working xformers version. An incompatible xformers version produces an error like the following:

NotImplementedError: No operator found for `memory_efficient_attention_forward` with inputs:
     query     : shape=(8, 1024, 8, 40) (torch.float16)
     key       : shape=(8, 1024, 8, 40) (torch.float16)
     value     : shape=(8, 1024, 8, 40) (torch.float16)
     attn_bias : <class 'NoneType'>
     p         : 0.0
`decoderF` is not supported because:
    xFormers wasn't build with CUDA support
    attn_bias type is <class 'NoneType'>
    operator wasn't built - see `python -m xformers.info` for more info
`flshattF@0.0.0` is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see `python -m xformers.info` for more info
`tritonflashattF` is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see `python -m xformers.info` for more info
    triton is not available
`cutlassF` is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see `python -m xformers.info` for more info
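This error means the installed xformers wheel was not built against your torch/CUDA combination. A minimal sketch of the usual recovery, assuming a pip-managed install (the exact version pin below is illustrative, not authoritative -- pick the xformers release that matches your torch build):

```shell
# Show which attention operators were built and why some are unavailable
python -m xformers.info

# Remove the mismatched build, then pin a release compatible with
# the installed torch. 0.0.20 is only an example pin -- check the
# xformers release notes for the version matching your torch/CUDA.
pip uninstall -y xformers
pip install xformers==0.0.20
```

If every operator still reports "wasn't built", the wheel and the local CUDA toolkit still disagree, and a wheel built for your exact CUDA version is needed.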