model: fix flash attention enabling - do not check the device type at this point, since the model can still be on CPU (see the first sketch below the log) 5940103 eitanrich committed on Nov 7, 2024
VAE: Support different latent_var_log options when returning intermediate features for 3D perceptual loss 7d89bb0 eitanrich committed on Oct 31, 2024
VAE: Support returning intermediate features for 3D perceptual loss 028b6a1 eitanrich committed on Oct 30, 2024
VAE: Support more configurations for Encoder and Decoder blocks 43d3c68 eitanrich committed on Oct 20, 2024
Merge pull request #7 from LightricksResearch/feature/fix-transformer-init-bug fc02e02 Dudu Moshe committed on Oct 8, 2024
transformer3d: fix "xora" init mode never triggering because the comparison expects a lower-case value (see the second sketch below the log). a3498bb dudumoshe committed on Oct 8, 2024
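
A minimal sketch of the pattern commit 5940103 describes, assuming a PyTorch module; the `Attention` class, `use_flash` flag, and method bodies here are hypothetical illustrations, not the repository's actual code. The point is to check only for flash-attention *capability* at construction time and defer anything device-dependent to `forward`, since the model is typically built on CPU and moved to GPU afterwards:

```python
import torch
import torch.nn.functional as F


class Attention(torch.nn.Module):
    """Hypothetical attention module illustrating the device-check fix."""

    def __init__(self, use_flash: bool = True):
        super().__init__()
        # Capability check only: do NOT inspect the parameters' device here.
        # The module is often constructed on CPU and only later moved with
        # .to("cuda"), so a device-type check at this point would wrongly
        # disable flash attention.
        self.use_flash = use_flash and hasattr(F, "scaled_dot_product_attention")

    def forward(self, q, k, v):
        if self.use_flash:
            # By forward time q/k/v are already on their final device, so
            # PyTorch can dispatch to the fused kernel where available.
            return F.scaled_dot_product_attention(q, k, v)
        # Plain softmax attention fallback.
        attn = (q @ k.transpose(-2, -1)) / (q.shape[-1] ** 0.5)
        return attn.softmax(dim=-1) @ v
```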
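
Likewise, a hypothetical illustration of the bug fixed in a3498bb (the function and return values are assumptions, not the repository's code): a comparison against a lower-case literal never matches a mixed-case config value unless the input is normalized first.

```python
def select_init_mode(init_mode: str) -> str:
    """Hypothetical dispatcher illustrating the case-sensitivity bug."""
    # Buggy version: `if init_mode == "xora"` never fires when the config
    # carries "Xora" or "XORA". Normalizing the case fixes the dispatch.
    if init_mode.lower() == "xora":
        return "xora"
    return "default"


assert select_init_mode("Xora") == "xora"      # matches after the fix
assert select_init_mode("other") == "default"  # unrelated modes unaffected
```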