2022-11-23 (Summary of last few weeks)
- New storage class; S3 support. We now have a generic storage class, which
  allows for special URLs anywhere you can usually specify a URL, e.g.
  `CHECKPOINT_URL`, `dest_url` (after dreambooth training), and the new
  `MODEL_URL` (see below). URLs like `s3:///bucket/filename` will work how
  you expect, but definitely read docs/storage.md to understand the format
  better. Note in particular the triple forward slash (`///`) at the
  beginning, which selects the default S3 endpoint.
- Dreambooth training, working but still in development. See this forum post
  for more info.
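As a rough illustration of the storage-URL shape described above (a hypothetical helper, not the project's actual parser — see docs/storage.md for the real format): an empty host part, as in `s3:///bucket/filename`, means "use the default S3 endpoint", while a URL with a host names an explicit endpoint.

```python
from urllib.parse import urlparse

def parse_storage_url(url):
    """Split an s3:// storage URL into (endpoint, bucket, key).

    An empty host part (the triple slash, "s3:///...") yields
    endpoint=None, i.e. "use the default S3 endpoint".
    """
    parts = urlparse(url)
    endpoint = parts.netloc or None  # None -> default S3 endpoint
    bucket, _, key = parts.path.lstrip("/").partition("/")
    return endpoint, bucket, key

print(parse_storage_url("s3:///bucket/filename"))
# -> (None, 'bucket', 'filename')
print(parse_storage_url("s3://minio:9000/bucket/filename"))
# -> ('minio:9000', 'bucket', 'filename')
```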
- `PRECISION` build var, defaults to `"fp16"`; set to `""` to use the model
  defaults (generally fp32).
- `CHECKPOINT_URL` conversion:
  - Crash / stop the build if conversion fails (rather than surfacing
    unclear errors later on).
  - Force `cpu` loading even for models that would otherwise default to GPU.
    This fixes certain models that previously crashed in the build stage
    (where no GPU is available).
  - `--extract-ema` on conversion, since these are the more important
    weights for inference.
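The fail-fast behaviour can be sketched roughly like this (illustrative names, not the project's actual code; `convert` stands in for the real conversion routine):

```python
import sys

def convert_or_die(convert, checkpoint_path):
    """Run a checkpoint conversion; exit non-zero if it fails.

    Failing the docker build here is clearer than shipping an image
    that only errors at inference time.
    """
    try:
        return convert(checkpoint_path)
    except Exception as exc:
        print(f"Checkpoint conversion failed: {exc}", file=sys.stderr)
        sys.exit(1)  # non-zero exit stops the build immediately
```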
- `CHECKPOINT_CONFIG_URL` now lets you specify a specific config file to use
  for conversion, instead of SD's default `v1-inference.yaml`.
- `MODEL_URL`. If your model is already in diffusers format, but you don't
  host it on HuggingFace, you can now have it downloaded at build time. At
  this stage, it should be a `.tar.zst` file. This is an alternative to
  `CHECKPOINT_URL`, which downloads a `.ckpt` file and converts it to
  diffusers.
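The relationship between the two download paths can be pictured like this (a hypothetical helper with made-up step names; the real build logic lives elsewhere):

```python
def build_plan(model_url=None, checkpoint_url=None):
    """Decide the build-time steps for fetching model weights.

    MODEL_URL points at an already-converted diffusers model packed as
    .tar.zst; CHECKPOINT_URL points at a .ckpt that still needs conversion.
    """
    if model_url:
        return ["download", "extract .tar.zst"]
    if checkpoint_url:
        return ["download", "convert .ckpt to diffusers"]
    return ["download from HuggingFace"]
```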
- `test.py`:
  - New `--banana` arg to run the test on banana. Set the environment
    variables `BANANA_API_KEY` and `BANANA_MODEL_KEY` first.
  - You can now add to and override a test's default JSON payload with:
    `--model-arg prompt="hello"` and `--call-arg MODEL_ID="my-model"`.
  - Support for extra timing data (e.g. dreambooth sends `train` and
    `upload` timings).
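The override behaviour can be sketched as follows (the payload structure here is an assumption for illustration, not necessarily test.py's actual shape): `--model-arg` values merge into the model inputs, `--call-arg` values into the call inputs, each overriding the test's defaults.

```python
def apply_overrides(payload, call_args, model_args):
    """Merge CLI-style overrides into a test's default JSON payload.

    Assumed payload shape (illustrative): {"modelInputs": {...},
    "callInputs": {...}}. Later dicts win, so CLI values override defaults.
    """
    return {
        "modelInputs": {**payload.get("modelInputs", {}), **model_args},
        "callInputs": {**payload.get("callInputs", {}), **call_args},
    }

default = {
    "modelInputs": {"prompt": "a photo"},
    "callInputs": {"MODEL_ID": "default-model"},
}
print(apply_overrides(default,
                      call_args={"MODEL_ID": "my-model"},
                      model_args={"prompt": "hello"}))
# -> {'modelInputs': {'prompt': 'hello'}, 'callInputs': {'MODEL_ID': 'my-model'}}
```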
- Dev: better caching solution. No more unruly `root-cache` directory. See
  CONTRIBUTING.md for more info.