For Z-Image-Turbo, make batch size > 1 work. #649
hinablue wants to merge 6 commits into ostris:main
Conversation
Quantization makes the results very bad; I need to do more testing.
After 1,000 steps of testing, I found the problem was in generating the sample images; the sample results were terrible. The downloaded LoKr model, however, was usable. I need to find the cause.
I updated the New Logical Flow and the Pros and Cons.
LoKr support just updated.
Rolled back all core changes and focused only on
I tested this branch on my RTX 5070 Ti (using only Z-Image Turbo + Base model), and batch size = 2 worked perfectly: no errors, and training ran smoothly. Thank you very much for your contribution; this fixes a really useful feature for many of us!
How do I apply this fix to my config?
try
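A general way to try an unmerged PR is to check out the PR branch (using GitHub's standard `pull/<N>/head` refs), then raise the batch size in the training config. The YAML below is only a sketch: the key names (`process`, `train.batch_size`, the `sd_trainer` type) are assumptions about the usual ai-toolkit config layout and may differ in your file:

```yaml
# Sketch only: raise the per-step batch size in the `train` section.
# Key names assume the common ai-toolkit config layout; adjust to yours.
config:
  process:
    - type: sd_trainer
      train:
        batch_size: 2   # was 1; this PR is what makes > 1 work for Z-Image-Turbo
```

The PR branch itself can be fetched with `git fetch origin pull/649/head:pr-649` followed by `git checkout pr-649`.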
This issue should be resolved now.
Tested on my Mac and it works fine (yep, I also made MPS work, but this PR is only for LoKr and BS > 1).
These changes should not affect the training of other models.
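For readers curious what kind of bug "batch size > 1" typically exposes: conditioning prepared for a single sample often isn't expanded along the batch dimension, so shapes mismatch once the batch holds more than one latent. The toy Python below is a hypothetical illustration of that pattern, not code from this PR; `expand_to_batch` and the list-based "tensors" are invented for the example:

```python
# Hypothetical illustration of a typical batch-size > 1 fix:
# per-sample conditioning must be repeated to match the batch
# dimension before the forward pass, but left alone if already batched.

def expand_to_batch(cond, batch_size):
    """Repeat one sample's conditioning (a flat list) batch_size times.

    If `cond` is already batched (a list of lists), return it unchanged.
    """
    if cond and isinstance(cond[0], list):  # already has a batch dimension
        return cond
    return [list(cond) for _ in range(batch_size)]

single = [0.1, 0.2, 0.3]              # conditioning for one sample
batched = expand_to_batch(single, 2)  # now usable with batch_size = 2
assert len(batched) == 2
```

With batch size 1 the missing expansion goes unnoticed, which is why such bugs surface only when the batch dimension grows.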