Merge method?
Hi phr, can you share some tips for merging the 1st edition? Thanks a lot.
I used the "ModelMergeBlocks" ComfyUI node and then Save Checkpoint. However, I don't think it did as much merging as I was hoping for, as the result was very WAN 2.1, hence switching to ModelMergeWAN2_1, which gives you much more fine-grained control.
Thanks a lot! So you switched to ModelMergeWAN2_1 to do the later version?
Hi, I came across your model merge and had a thought.
On this Reddit thread (https://www.reddit.com/r/StableDiffusion/comments/1n44k5t/which_wan22_workflow_are_you_using_to_mitigate), someone shows a workflow where the models are kept separate, and the high-noise model is combined with a strong application of a speed LoRA.
Now, in your case, you baked all the LoRAs directly into the safetensors merge. Would it be possible to try "baking" the speed LoRA into the high-noise model first, following the approach in that Reddit thread, and then proceed with the merge? This might allow the merged model to keep the benefits of the speed LoRA without needing to apply it manually afterwards.
Do you think this is feasible?
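For what it's worth, "baking" a LoRA into a checkpoint is mathematically just folding the low-rank update into each base weight before saving. Here is a minimal sketch of that idea; the function name, variable names, and toy shapes are made up for illustration (real WAN safetensors keys and LoRA key naming differ, and ComfyUI's merge nodes handle this per-tensor for you):

```python
import numpy as np

def bake_lora(base_weight, lora_down, lora_up, alpha, rank):
    """Fold a LoRA update into a base weight: W' = W + (alpha / rank) * (up @ down).

    base_weight: (out, in) base model weight
    lora_down:   (rank, in) "A" / lora_down matrix
    lora_up:     (out, rank) "B" / lora_up matrix
    """
    scale = alpha / rank
    return base_weight + scale * (lora_up @ lora_down)

# Toy example: a 4x4 base weight with a rank-2 LoRA
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
down = rng.standard_normal((2, 4))  # lora_down ("A")
up = rng.standard_normal((4, 2))    # lora_up ("B")

W_baked = bake_lora(W, down, up, alpha=1.0, rank=2)
```

Once every affected tensor has been rewritten this way and saved, the speed LoRA's effect is permanent in the high-noise checkpoint, so a subsequent block merge would carry it along without loading the LoRA at inference time.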