Stable Diffusion ControlNet Depth EXPLAINED. This is a full tutorial dedicated to the ControlNet Depth preprocessor and model. In this video, I show you how to use it and give examples of what to use ControlNet Depth for.
The black-and-white image that the ControlNet Depth preprocessor generates is an estimated depth map of your reference image: lighter areas are closer to the camera, darker areas are farther away. Stable Diffusion then uses that map to keep the same 3D layout while your prompt changes everything else, so you can generate awesome and fun AI images from it!
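If you'd rather script this than click through the A1111 UI, here is a minimal sketch of the same workflow in Python using the diffusers and controlnet_aux libraries. The model IDs and file names below are my assumptions (the standard public SD 1.5 depth ControlNet and MiDaS annotator repos); double-check them against the links further down in this description.

# Minimal sketch: ControlNet Depth outside the A1111 UI, via diffusers.
# Model IDs and file names are assumptions; verify against the links below.
import torch
from PIL import Image
from controlnet_aux import MidasDetector
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

# 1) Preprocess: estimate a depth map from the reference image.
midas = MidasDetector.from_pretrained("lllyasviel/Annotators")
depth_map = midas(Image.open("reference.png"))  # light = near, dark = far

# 2) Generate: the depth map locks the composition, the prompt does the rest.
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/control_v11f1p_sd15_depth", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

result = pipe(
    "a marble statue in a sunlit courtyard",
    image=depth_map,
    num_inference_steps=25,
).images[0]
result.save("output.png")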
🎥 Watch Next:
SDXL Lora Training tutorial - https://youtu.be/d4QJg4YPm1c
\\__ ARTICLES __//
👉 Full ControlNet Guide (learn how to install ControlNet in Automatic1111 and how to download the Depth model): https://creatixai.com/stable-diffusion-controlnet-tutorial-guide/
👉 ControlNet - Depth Guide: https://creatixai.com/controlnet-depth-tutorial-stable-diffusion-a1111/
\\__ LINKS __//
sd-webui-controlnet GitHub: https://github.com/Mikubill/sd-webui-controlnet
ControlNet v1.1 models: https://huggingface.co/lllyasviel/ControlNet-v1-1/tree/main
SDXL Depth model download: https://huggingface.co/lllyasviel/sd_control_collection/tree/main
\\__ TIMESTAMPS __//
00:00 - What to use Depth for?
00:38 - What is ControlNet Depth?
01:28 - Simple ControlNet Depth Workflow
02:27 - Depth Preprocessors - Midas vs Zoe vs Leres (code snippet below)
03:09 - Preprocessors test - color change
04:10 - Preprocessors test - gender change
05:09 - ControlNet Depth for SDXL
06:05 - Img2img - changing material
07:43 - AI Text Effects with Depth
08:59 - ControlNet Depth for Poses
10:34 - Depth for Composition
11:41 - What's Next?
12:11 - Is my Voice AI?
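For the preprocessor comparison at 02:27, here is a small sketch that runs one reference image through all three depth estimators from controlnet_aux so you can compare the maps side by side yourself. Again, the Annotators repo ID and file names are my assumptions; check them before running.

# Minimal sketch of the 02:27 comparison: the same reference image run
# through the MiDaS, Zoe, and LeReS depth estimators from controlnet_aux.
from PIL import Image
from controlnet_aux import LeresDetector, MidasDetector, ZoeDetector

reference = Image.open("reference.png")

for name, detector_cls in [("midas", MidasDetector),
                           ("zoe", ZoeDetector),
                           ("leres", LeresDetector)]:
    detector = detector_cls.from_pretrained("lllyasviel/Annotators")
    depth_map = detector(reference)
    depth_map.save(f"depth_{name}.png")  # compare detail and contrast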
Hope you enjoyed this quick Stable Diffusion AI tutorial! Let me know in the comments what else you'd like to see. Cheers :)