Commit Graph

185 Commits

Author SHA1 Message Date
BlenderNeko
a2e18b1504 allow disabling of progress bar when sampling 2023-04-30 18:59:58 +02:00
comfyanonymous
071011aebe Mask strength should be separate from area strength. 2023-04-29 20:06:53 -04:00
comfyanonymous
870fae62e7 Merge branch 'condition_by_mask_node' of https://github.com/guill/ComfyUI 2023-04-29 15:05:18 -04:00
Jacob Segal
af02393c2a Default to sampling entire image
By default, when applying a mask to a condition, the entire image will
still be used for sampling. The new "set_area_to_bounds" option on the
node will allow the user to automatically limit conditioning to the
bounds of the mask.

I've also removed the dependency on torchvision for calculating bounding
boxes. I've taken the opportunity to fix some frustrating details in the
other version:
1. An all-0 mask will no longer cause an error
2. Indices are returned as integers instead of floats so they can be
   used to index into tensors.
2023-04-29 00:16:58 -07:00
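The bounding-box fixes described above (no error on an all-0 mask, integer indices suitable for tensor indexing) can be sketched without torchvision. This is a hypothetical reconstruction using NumPy in place of torch for brevity; the function name `mask_bounds` and the full-image fallback are assumptions, not the commit's actual code.

```python
import numpy as np

def mask_bounds(mask):
    """Return (y0, x0, y1, x1) integer bounds of the nonzero mask region.

    An all-zero mask falls back to the full image instead of raising,
    mirroring fix 1; all indices are Python ints (fix 2), so they can
    be used directly to slice tensors.
    """
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        # all-0 mask: sample the entire image
        return 0, 0, mask.shape[0], mask.shape[1]
    return int(ys.min()), int(xs.min()), int(ys.max()) + 1, int(xs.max()) + 1

m = np.zeros((5, 5))
m[1:3, 2:4] = 1
bounds = mask_bounds(m)        # (1, 2, 3, 4)
empty = mask_bounds(np.zeros((4, 4)))  # full image: (0, 0, 4, 4)
```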
comfyanonymous
056e5545ff Don't try to get vram from xpu or cuda when directml is enabled. 2023-04-29 00:28:48 -04:00
comfyanonymous
2ca934f7d4 You can now select the device index with: --directml id
Like this for example: --directml 1
2023-04-28 16:51:35 -04:00
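A flag that works both bare (`--directml`) and with an index (`--directml 1`) can be expressed with argparse's `nargs="?"`. This is a hedged sketch of how such a flag could be declared; the `-1` "default device" sentinel and the metavar are illustrative assumptions, not necessarily what the repository uses.

```python
import argparse

parser = argparse.ArgumentParser()
# nargs="?" makes the device index optional:
#   --directml      -> const (-1, assumed here to mean "default device")
#   --directml 1    -> 1
#   (flag absent)   -> None, i.e. DirectML disabled
parser.add_argument("--directml", type=int, nargs="?", const=-1,
                    default=None, metavar="DIRECTML_DEVICE")

bare = parser.parse_args(["--directml"]).directml       # -1
indexed = parser.parse_args(["--directml", "1"]).directml  # 1
absent = parser.parse_args([]).directml                 # None
```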
comfyanonymous
3baded9892 Basic torch_directml support. Use --directml to use it. 2023-04-28 14:28:57 -04:00
Jacob Segal
e214c917ae Add Condition by Mask node
This PR adds support for a Condition by Mask node. This node allows
conditioning to be limited to a non-rectangular area.
2023-04-27 20:03:27 -07:00
comfyanonymous
5a971cecdb Add callback to sampler function.
Callback format is: callback(step, x0, x)
2023-04-27 04:38:44 -04:00
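The `callback(step, x0, x)` contract above can be illustrated with a toy loop. This is only a minimal sketch of the calling convention: real samplers pass latent tensors for `x0` (current denoised estimate) and `x` (current noisy latent), and `run_sampler` is a hypothetical stand-in, not a ComfyUI function.

```python
def run_sampler(steps, callback=None):
    """Toy denoising loop showing when and how the callback fires."""
    x = 1.0
    for step in range(steps):
        x = x * 0.5   # pretend denoising update
        x0 = x        # placeholder for the current denoised estimate
        if callback is not None:
            callback(step, x0, x)  # the commit's callback(step, x0, x) format
    return x

seen = []
result = run_sampler(3, callback=lambda step, x0, x: seen.append(step))
```

Because the callback is optional, existing callers that pass nothing are unaffected.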
comfyanonymous
aa57136dae Some fixes to the batch masks PR. 2023-04-25 01:12:40 -04:00
comfyanonymous
c50208a703 Refactor more code to sample.py 2023-04-24 23:25:51 -04:00
comfyanonymous
7983b3a975 This is cleaner this way. 2023-04-24 22:45:35 -04:00
BlenderNeko
0b07b2cc0f gligen tuple 2023-04-24 21:47:57 +02:00
BlenderNeko
d9b1595f85 made sample functions more explicit 2023-04-24 12:53:10 +02:00
BlenderNeko
5818539743 add docstrings 2023-04-23 20:09:09 +02:00
BlenderNeko
8d2de420d3 Merge branch 'master' of https://github.com/BlenderNeko/ComfyUI 2023-04-23 20:02:18 +02:00
BlenderNeko
2a09e2aa27 refactor/split various bits of code for sampling 2023-04-23 20:02:08 +02:00
comfyanonymous
5282f56434 Implement Linear hypernetworks.
Add a HypernetworkLoader node to use hypernetworks.
2023-04-23 12:35:25 -04:00
comfyanonymous
6908f9c949 This makes pytorch2.0 attention perform a bit faster. 2023-04-22 14:30:39 -04:00
comfyanonymous
907010e082 Remove some useless code. 2023-04-20 23:58:25 -04:00
comfyanonymous
96b57a9ad6 Don't pass adm to model when it doesn't support it. 2023-04-19 21:11:38 -04:00
comfyanonymous
3696d1699a Add support for GLIGEN textbox model. 2023-04-19 11:06:32 -04:00
comfyanonymous
884ea653c8 Add a way for nodes to set a custom CFG function. 2023-04-17 11:05:15 -04:00
comfyanonymous
73c3e11e83 Fix model_management import so it doesn't get executed twice. 2023-04-15 19:04:33 -04:00
comfyanonymous
81d1f00df3 Some refactoring: from_tokens -> encode_from_tokens 2023-04-15 18:46:58 -04:00
comfyanonymous
719c26c3c9 Merge branch 'master' of https://github.com/BlenderNeko/ComfyUI 2023-04-15 14:16:50 -04:00
BlenderNeko
d0b1b6c6bf fixed improper padding 2023-04-15 19:38:21 +02:00
comfyanonymous
deb2b93e79 Move code to empty gpu cache to model_management.py 2023-04-15 11:19:07 -04:00
comfyanonymous
04d9bc13af Safely load pickled embeds that don't load with weights_only=True. 2023-04-14 15:33:43 -04:00
BlenderNeko
da115bd78d ensure backwards compat with optional args 2023-04-14 21:16:55 +02:00
BlenderNeko
752f7a162b align behavior with old tokenize function 2023-04-14 21:02:45 +02:00
comfyanonymous
334aab05e5 Don't stop workflow if loading embedding fails. 2023-04-14 13:54:00 -04:00
BlenderNeko
73175cf58c split tokenizer from encoder 2023-04-13 22:06:50 +02:00
BlenderNeko
8489cba140 add unique ID per word/embedding for tokenizer 2023-04-13 22:01:01 +02:00
comfyanonymous
92eca60ec9 Fix for new transformers version. 2023-04-09 15:55:21 -04:00
comfyanonymous
1e1875f674 Print xformers version and warning about 0.0.18 2023-04-09 01:31:47 -04:00
comfyanonymous
7e254d2f69 Clarify what --windows-standalone-build does. 2023-04-07 15:52:56 -04:00
comfyanonymous
44fea05064 Cleanup. 2023-04-07 02:31:46 -04:00
comfyanonymous
58ed0f2da4 Fix loading SD1.5 diffusers checkpoint. 2023-04-07 01:30:33 -04:00
comfyanonymous
8b9ac8fedb Merge branch 'master' of https://github.com/sALTaccount/ComfyUI 2023-04-07 01:03:43 -04:00
comfyanonymous
64557d6781 Add a --force-fp32 argument to force fp32 for debugging. 2023-04-07 00:27:54 -04:00
comfyanonymous
bceccca0e5 Small refactor. 2023-04-06 23:53:54 -04:00
comfyanonymous
28a7205739 Merge branch 'ipex' of https://github.com/kwaa/ComfyUI-IPEX 2023-04-06 23:45:29 -04:00
藍+85CD
05eeaa2de5 Merge branch 'master' into ipex 2023-04-07 09:11:30 +08:00
EllangoK
28fff5d1db fixes lack of support for multi configs
also adds some metavars to argparse
2023-04-06 19:06:39 -04:00
comfyanonymous
f84f2508cc Rename the cors parameter to something more verbose. 2023-04-06 15:24:55 -04:00
EllangoK
48efae1608 makes cors a cli parameter 2023-04-06 15:06:22 -04:00
EllangoK
01c1fc669f set listen flag to listen on all if specified 2023-04-06 13:19:00 -04:00
藍+85CD
3e2608e12b Fix auto lowvram detection on CUDA 2023-04-06 15:44:05 +08:00
sALTaccount
60127a8304 diffusers loader 2023-04-05 23:57:31 -07:00