Commit Graph

  • 042a905c37 Open yaml files with utf-8 encoding for extra_model_paths.yaml (#6807) Robin Huang 2025-02-13 17:39:04 -0800
  • 019c7029ea Add a way to set a different compute dtype for the model at runtime. comfyanonymous 2025-02-13 20:34:03 -0500
  • 755ee08a77 Fix test assertion. robinjhuang 2025-02-13 13:30:28 -0800
  • 08597ca9a1 Using utf-8 encoding for yaml files. robinjhuang 2025-02-13 12:01:28 -0800
  • 6ad56a5961 [Edited] Refactor code to optimize performance MayureshMore 2025-02-13 08:50:21 -0800
  • 8773ccf74d Better memory estimation for ROCm that support mem efficient attention. comfyanonymous 2025-02-13 08:32:36 -0500
  • be7e3d4d69 Merge branch 'comfyanonymous:master' into model-sampling-cpu dave-juicelabs 2025-02-12 13:40:22 -0600
  • 1d5d6586f3 Fix ruff. comfyanonymous 2025-02-12 06:49:16 -0500
  • 35740259de mix_ascend_bf16_infer_err (#6794) zhoufan2956 2025-02-12 19:48:11 +0800
  • ab888e1e0b Add add_weight_wrapper function to model patcher. comfyanonymous 2025-02-12 05:49:00 -0500
  • 7ed909f457 support system prompt and cfg renorm in Lumina2 lzyhha 2025-02-12 10:41:04 +0000
  • a321f9ddf3 mix_ascend_bf16_infer_err zhoufan2956 2025-02-12 17:13:10 +0800
  • d2504fb701 Merge branch 'master' into worksplit-multigpu Jedrzej Kosinski 2025-02-11 22:34:51 -0600
  • d9f0fcdb0c Cleanup. comfyanonymous 2025-02-11 17:17:03 -0500
  • b124256817 Fix for running via DirectML (#6542) HishamC 2025-02-11 14:11:32 -0800
  • f594ea41f5 update casual mask calculation Chowdhury, Hisham 2025-02-11 11:41:17 -0800
  • af6ce81f00 CodeBeaverAI 2025-02-11 15:47:58 +0100
  • b83869eb3c test: Add coverage improvement test for tests/test_cli_args.py CodeBeaverAI 2025-02-11 15:47:57 +0100
  • af4b7c91be Make --force-fp16 actually force the diffusion model to be fp16. comfyanonymous 2025-02-11 08:31:46 -0500
  • e57d2282d1 Fix incorrect Content-Type for WebP images (#6752) bananasss00 2025-02-11 12:48:35 +0300
  • e003633be0 mark node as beta bymyself 2025-02-11 01:40:55 -0700
  • 57be03b31f use literal type for image_folder field bymyself 2025-02-11 00:48:52 -0700
  • 252faf810a update node_typing.py bymyself 2025-02-11 00:46:42 -0700
  • df56b042c7 add route for input/output/temp files bymyself 2025-02-11 00:45:22 -0700
  • 673d758625 add LoadImageOutput node bymyself 2025-02-11 00:34:39 -0700
  • ebfe7a5679 fix the issue for model first inference with lora ethan 2025-02-10 19:54:44 -0800
  • 9ab14711e5 Remove deepcopy usage in PromptQueue catboxanon 2025-02-10 15:12:54 -0500
  • 4027466c80 Make lumina model work with any latent resolution. comfyanonymous 2025-02-10 00:24:20 -0500
  • 095d867147 Remove useless function. model-paths-helper comfyanonymous 2025-02-09 07:01:38 -0500
  • caeb27c3a5 res_multistep: Fix cfgpp and add ancestral samplers (#6731) Pam 2025-02-09 05:39:58 +0500
  • 3d06e1c555 Make error more clear to user. comfyanonymous 2025-02-08 18:57:24 -0500
  • 43a74c0de1 Allow FP16 accumulation with --fast (#6453) catboxanon 2025-02-08 17:00:56 -0500
  • d2757bf29c Fix incorrect Content-Type for WebP images bananasss00 2025-02-08 23:51:20 +0300
  • af93c8d1ee Document which text encoder to use for lumina 2. comfyanonymous 2025-02-08 06:54:03 -0500
  • 0c2067ada6 add new file endpoints("v1") to use query parameters instead of path parameters for proxy compatibility bigcat88 2025-01-07 09:27:54 +0200
  • 7b82959572 Merge branch 'comfyanonymous:master' into directMLChanges HishamC 2025-02-07 21:26:08 -0800
  • 2bde3ff5da remove light_intensity and fov from load3d Terry Jia 2025-02-07 16:38:17 -0500
  • 832e3f5ca3 Fix another small bug in attention_bias redux (#6737) Raphael Walker 2025-02-07 20:44:43 +0100
  • b03763bca6 Merge branch 'multigpu_support' into worksplit-multigpu Jedrzej Kosinski 2025-02-07 13:27:49 -0600
  • 9607709049 oh shit wait there was another bug Raphael Walker 2025-02-07 09:37:22 +0100
  • 9bdb94693d Merge branch 'comfyanonymous:master' into master Raphael Walker 2025-02-07 09:36:01 +0100
  • 079eccc92a Don't compress http response by default. comfyanonymous 2025-02-07 03:29:12 -0500
  • a5fe6bf9ce res_multistep: Fix cfgpp and add ancestral samplers Pam 2025-02-07 05:49:51 +0500
  • b6951768c4 fix a bug in the attn_masked redux code when using weight=1.0 (#6721) Raphael Walker 2025-02-06 22:51:16 +0100
  • b55c224765 Merge branch 'master' into directMLChanges HishamC 2025-02-06 12:28:09 -0800
  • 75ec851324 fix formating Chowdhury, Hisham 2025-02-06 12:23:33 -0800
  • fca304debf Update frontend to v1.8.14 (#6724) Comfy Org PR Bot 2025-02-07 00:43:10 +0900
  • 62d3a3057d Update frontend to v1.8.14 huchenlei 2025-02-06 15:29:08 +0000
  • 476aa79b64 Let --cuda-device take in a string to allow multiple devices (or device order) to be chosen, print available devices on startup, potentially support MultiGPU Intel and Ascend setups Jedrzej Kosinski 2025-02-06 08:44:07 -0600
  • 36e4b81b33 Added new files Sachin2501 2025-02-06 19:49:59 +0530
  • 441cfd1a7a Merge branch 'master' into multigpu_support Jedrzej Kosinski 2025-02-06 08:10:48 -0600
  • d7e8d0af01 fix a bug in the attn_masked redux code when using weight=1.0 Raphael Walker 2025-02-06 14:22:05 +0100
  • 14880e6dba Remove some useless code. comfyanonymous 2025-02-06 05:00:19 -0500
  • 9b608b6f92 Remove debug logging for skipped custom nodes in prestartup script execution younes127001 2025-02-06 02:51:46 +0100
  • c2e4b96f85 Add support for including custom nodes via environment variable younes127001 2025-02-06 02:28:51 +0100
  • f1059b0b82 Remove unused GET /files API endpoint (#6714) Chenlei Hu 2025-02-05 18:48:36 -0500
  • debabccb84 Bump ComfyUI version to v0.3.14 v0.3.14 comfyanonymous 2025-02-05 15:47:46 -0500
  • 37cd448529 Set the shift for Lumina back to 6. comfyanonymous 2025-02-05 14:49:52 -0500
  • 1d56bc20b9 Remove unused GET /files API endpoint huchenlei 2025-02-05 13:49:33 -0500
  • 94f21f9301 Upcasting rope to fp32 seems to make no difference in this model. comfyanonymous 2025-02-05 04:32:47 -0500
  • 4f989510b4 add SaveImageWEBP node TechnoByte 2025-02-05 10:30:52 +0100
  • 60653004e5 Use regular numbers for rope in lumina model. comfyanonymous 2025-02-05 04:16:59 -0500
  • 64e9c1edb4 also load hunyuan loras rickard 2025-02-05 07:46:47 +0100
  • a57d635c5f Fix lumina 2 batches. comfyanonymous 2025-02-04 21:48:11 -0500
  • 64bc91467c Merge 9fedd24fc4 into 016b219dcc Saquib Alam 2025-02-05 00:00:08 +0530
  • 016b219dcc Add Lumina Image 2.0 to Readme. comfyanonymous 2025-02-04 08:08:36 -0500
  • 8ac2dddeed Lower the default shift of lumina to reduce artifacts. comfyanonymous 2025-02-04 06:50:37 -0500
  • 3e880ac709 Fix on python 3.9 comfyanonymous 2025-02-04 04:20:56 -0500
  • e5ea112a90 Support Lumina 2 model. comfyanonymous 2025-02-04 03:56:00 -0500
  • 8d88bfaff9 allow searching for new .pt2 extension, which can contain AOTI compiled modules (#6689) Raphael Walker 2025-02-03 23:07:35 +0100
  • 5cfe39bb89 allow searching for new .pt2 extension, which can contain AOTI compiled modules Raphael Walker 2025-02-03 14:08:40 +0100
  • ed4d92b721 Model merging nodes for cosmos. comfyanonymous 2025-02-03 03:31:39 -0500
  • 932ae8d9ca Update frontend to v1.8.13 (#6682) Comfy Org PR Bot 2025-02-03 07:54:44 +0900
  • 0b3107e99a Update frontend to v1.8.13 huchenlei 2025-02-02 22:47:53 +0000
  • 44e19a28d3 Use maximum negative value instead of -inf for masks in text encoders. comfyanonymous 2025-02-02 09:45:07 -0500
  • 0a0df5f136 better guide message for sageattention (#6634) Dr.Lt.Data 2025-02-02 23:26:47 +0900
  • 24d6871e47 add disable-compres-response-body cli args; add compress middleware; (#6672) KarryCharon 2025-02-02 22:24:55 +0800
  • 99a5c1068a Merge branch 'master' into multigpu_support Jedrzej Kosinski 2025-02-02 03:19:18 -0600
  • 10184d8314 add disable-compres-response-body cli args; add compress middleware; karrcharon 2025-02-02 11:26:40 +0800
  • 9e1d301129 Only use stable cascade lora format with cascade model. comfyanonymous 2025-02-01 06:35:22 -0500
  • 768e035868 Add node for preview 3d animation (#6594) Terry Jia 2025-01-31 13:09:07 -0500
  • 669e0497ea Update frontend to v1.8.12 (#6662) Comfy Org PR Bot 2025-02-01 03:07:37 +0900
  • ed1c7b50d4 Update frontend to v1.8.12 huchenlei 2025-01-31 17:48:09 +0000
  • 49aaad7c21 Only add PNG chunks if metadata is enabled catboxanon 2025-01-31 11:40:28 -0500
  • 541dc08547 Update Readme. comfyanonymous 2025-01-31 08:35:48 -0500
  • b6b475191d Add sqlite db pythongosssss 2025-01-30 21:48:53 +0000
  • a4aba18d29 PNG cICP chunk support catboxanon 2025-01-30 15:44:41 -0500
  • 8d8dc9a262 Allow batch of different sigmas when noise scaling. comfyanonymous 2025-01-30 06:49:52 -0500
  • 6edd534542 Merge tag 'v0.3.13' into r42_comfyui_v0.3.13 Render Node 2025-01-30 10:55:43 +0000
  • 0a03bc230a Merge branch 'master' of https://github.com/Random42-Scientific-Communication/r42_comfyui Render Node 2025-01-30 10:54:09 +0000
  • d73604e85e Commit Files Render Node 2025-01-30 10:53:18 +0000
  • f49b485e1f commit changes Render Node 2025-01-30 10:26:39 +0000
  • 77e9294c08 add Query Device ethan 2025-01-30 00:20:58 -0800
  • 2f98c24360 Update Readme with link to instruction for Nvidia 50 series. comfyanonymous 2025-01-30 02:12:43 -0500
  • ef85058e97 Bump ComfyUI version to v0.3.13 v0.3.13 comfyanonymous 2025-01-29 16:07:12 -0500
  • f9230bd357 Update the python version in some workflows. comfyanonymous 2025-01-29 15:54:13 -0500
  • 02747cde7d Carry over change from _calc_cond_batch into _calc_cond_batch_multigpu Jedrzej Kosinski 2025-01-29 11:10:23 -0600
  • 317af7201f remove history commit ethan 2025-01-29 07:05:47 -0800
  • d1f61cca5e add openvino to torch compile ethan 2025-01-29 07:03:31 -0800
  • 537c27cbf3 Bump default cuda version in standalone package to 126. comfyanonymous 2025-01-29 08:13:33 -0500