From 10c4da6c6a25171f9a80cd61405865ebcc2226bc Mon Sep 17 00:00:00 2001
From: comfyanonymous <121283862+comfyanonymous@users.noreply.github.com>
Date: Wed, 9 Jul 2025 03:36:12 -0400
Subject: [PATCH] Updated Which GPU should I buy for ComfyUI (markdown)

---
 Which-GPU-should-I-buy-for-ComfyUI.md | 30 ++++++++++++---------------
 1 file changed, 13 insertions(+), 17 deletions(-)

diff --git a/Which-GPU-should-I-buy-for-ComfyUI.md b/Which-GPU-should-I-buy-for-ComfyUI.md
index 89c9cf2..2141c79 100644
--- a/Which-GPU-should-I-buy-for-ComfyUI.md
+++ b/Which-GPU-should-I-buy-for-ComfyUI.md
@@ -47,27 +47,23 @@ Unsupported cards might be a real pain to get running.
 
 # C Tier
 
+## Intel (Linux + Windows)
+
+Officially supported in pytorch. People seem to get it working fine but I had trouble with my integrated intel GPU.
+
+## AMD (Windows)
+
+Unofficial pytorch rocm builds for windows have come out that work decently but they are still a bit of a pain to get working properly.
+
+Things might improve in the future once they have official pytorch ROCm working on windows.
+
+# D Tier
+
 ## Mac with Apple silicon
 
 Officially supported in pytorch. It works but they love randomly breaking things with OS updates.
 
-Very slow. A lot of ops are not properly supported.
-
-## Intel (Linux + Windows)
-
-It works but it requires a custom pytorch extension and there are sometimes some weird issues.
-
-I expect things to improve over time especially once it is officially supported in pytorch.
-
-# D Tier
-
-## AMD (Windows)
-
-It requires a pytorch extension (pytorch DirectML) or a custom zluda pytorch build.
-
-You will have a painful experience.
-
-Things might improve in the future once they have pytorch ROCm working on windows.
+Very slow. A lot of ops are not properly supported. No fp8 support at all.
 
 # F Tier
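
The tiers above are largely about which accelerator backends the stock pytorch build exposes (CUDA/ROCm, Apple MPS, Intel XPU). Not part of the patch, but as a minimal sketch using only standard pytorch API calls, this is one way to check which backend a given install will actually use, and whether an fp8 dtype is even present:

```python
# Minimal sketch (assumption: standard pytorch API, not part of the patch above).
# Reports the first usable accelerator backend and whether an fp8 dtype exists.
import torch

def detect_backend() -> str:
    """Return a short label for the first usable accelerator backend."""
    if torch.cuda.is_available():               # NVIDIA CUDA, or AMD via ROCm builds
        return f"cuda/rocm ({torch.cuda.get_device_name(0)})"
    if torch.backends.mps.is_available():       # Apple silicon (Metal / MPS)
        return "mps"
    if hasattr(torch, "xpu") and torch.xpu.is_available():  # Intel GPUs (XPU)
        return "xpu"
    return "cpu"

if __name__ == "__main__":
    print(f"torch {torch.__version__}: backend = {detect_backend()}")
    # fp8 dtypes only exist in newer pytorch builds; MPS lacks working fp8 ops.
    print("fp8 dtype present:", hasattr(torch, "float8_e4m3fn"))
```

Running this in the same environment as ComfyUI gives a quick sanity check that the install matches the tier you expect (e.g. an AMD card on Windows reporting "cpu" usually means the ROCm build didn't take).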