ControlNet Aux Preprocessors for ComfyUI
controlnet aux preprocessors
comfyui info
Description
{
"titleSector": {
"title": "ComfyUI ControlNet Aux 辅助预处理器",
"subtitle": "即插即用的ComfyUI ControlNet Aux节点集,用于创建ControlNet提示图像",
"backgroundImage": "https://source.unsplash.com/random/1920x1080/?ai,technology",
"buttons": [
{
"text": "了解更多",
"href": "#功能概述",
"variant": "primary"
},
{
"text": "安装指南",
"href": "#安装",
"variant": "secondary"
}
]
},
"sections": {
"overview": {
"id": "功能概述",
"title": "ControlNet Aux 功能概述",
"content": [
{
"title": "什么是ControlNet Aux辅助预处理器?",
"description": "这是一个为ComfyUI设计的ControlNet Aux即插即用节点集,用于创建ControlNet提示图像。代码从ControlNet项目的相应文件夹中复制,并连接到🤗 Hub。",
"credit": "所有功劳和版权归属于lllyasviel",
"creditLink": "https://github.com/lllyasviel"
}
],
"banner": {
"image": "https://raw.githubusercontent.com/Fannovel16/comfyui_controlnet_aux/main/examples/CNAuxBanner.jpg",
"alt": "ComfyUI ControlNet Aux Banner",
"caption": ""动漫风格,街头抗议,赛博朋克城市,一位有着粉色头发和金色眼睛的女性(看着观众)举着一个标有'ComfyUI ControlNet Aux'的霓虹粉色粗体文字的标志""
}
},
"updates": {
"id": "更新",
"title": "ControlNet Aux 更新",
"content": "前往更新页面查看ControlNet Aux最新更新。",
"link": {
"text": "更新页面",
"url": "./UPDATES.md"
}
},
"installation": {
"id": "安装",
"title": "ControlNet Aux 安装指南",
"methods": [
{
"title": "使用ComfyUI Manager安装(推荐)",
"description": "安装ComfyUI Manager并按照其中介绍的步骤安装此仓库。",
"link": {
"text": "ComfyUI Manager",
"url": "https://github.com/ltdrdata/ComfyUI-Manager"
}
},
{
"title": "替代方法",
"description": "如果您在Linux上运行,或在Windows上使用非管理员账户,您需要确保/ComfyUI/custom_nodes和comfyui_controlnet_aux具有写入权限。",
"note": "现在有一个install.bat文件,您可以运行它来安装到便携式版本(如果检测到)。否则,它将默认为系统安装,并假设您按照ComfyUI的手动安装步骤进行操作。",
"fallback": {
"title": "如果您无法运行install.bat(例如,您是Linux用户)",
"description": "打开CMD/Shell并执行以下操作:",
"steps": [
"导航到您的/ComfyUI/custom_nodes/文件夹",
"运行git clone https://github.com/Fannovel16/comfyui_controlnet_aux/",
"导航到您的comfyui_controlnet_aux文件夹",
{
"type": "code",
"label": "便携式/venv:",
"command": "path/to/ComfUI/python_embeded/python.exe -s -m pip install -r requirements.txt"
},
{
"type": "code",
"label": "使用系统Python:",
"command": "pip install -r requirements.txt"
},
"启动ComfyUI"
]
}
}
]
},
"nodes": {
"id": "节点",
"title": "ControlNet Aux 节点",
"introduction": "请注意,此ControlNet Aux仓库仅支持制作提示图像的预处理器(例如棍棒人、Canny边缘等)。",
"note": "除了Inpaint之外,所有ControlNet Aux预处理器都集成到AIO Aux Preprocessor节点中。此节点允许您快速获取预处理器,但无法设置预处理器自己的阈值参数。您需要直接使用其节点来设置阈值。",
"categories": [
{
"title": "线条提取器",
"table": {
"headers": ["预处理器节点", "sd-webui-controlnet/other", "ControlNet/T2I-Adapter"],
"rows": [
["Binary Lines", "binary", "control_scribble"],
["Canny Edge", "canny", "control_v11p_sd15_canny
control_canny
t2iadapter_canny"],
["HED Soft-Edge Lines", "hed", "control_v11p_sd15_softedge
control_hed"],
["Standard Lineart", "standard_lineart", "control_v11p_sd15_lineart"],
["Realistic Lineart", "lineart (or lineart_coarse
if coarse
is enabled)", "control_v11p_sd15_lineart"],
["Anime Lineart", "lineart_anime", "control_v11p_sd15s2_lineart_anime"],
["Manga Lineart", "lineart_anime_denoise", "control_v11p_sd15s2_lineart_anime"],
["M-LSD Lines", "mlsd", "control_v11p_sd15_mlsd
control_mlsd"],
["PiDiNet Soft-Edge Lines", "pidinet", "control_v11p_sd15_softedge
control_scribble"],
["Scribble Lines", "scribble", "control_v11p_sd15_scribble
control_scribble"],
["Scribble XDoG Lines", "scribble_xdog", "control_v11p_sd15_scribble
control_scribble"],
["Fake Scribble Lines", "scribble_hed", "control_v11p_sd15_scribble
control_scribble"],
["TEED Soft-Edge Lines", "teed", "controlnet-sd-xl-1.0-softedge-dexined
control_v11p_sd15_softedge (理论上)"],
["Scribble PiDiNet Lines", "scribble_pidinet", "control_v11p_sd15_scribble
control_scribble"],
["AnyLine Lineart", "", "mistoLine_fp16.safetensors
mistoLine_rank256
control_v11p_sd15s2_lineart_anime
control_v11p_sd15_lineart"]
]
}
},
{
"title": "法线和深度估计器",
"table": {
"headers": ["预处理器节点", "sd-webui-controlnet/other", "ControlNet/T2I-Adapter"],
"rows": [
["MiDaS Depth Map", "(normal) depth", "control_v11f1p_sd15_depth
control_depth
t2iadapter_depth"],
["LeReS Depth Map", "depth_leres", "control_v11f1p_sd15_depth
control_depth
t2iadapter_depth"],
["Zoe Depth Map", "depth_zoe", "control_v11f1p_sd15_depth
control_depth
t2iadapter_depth"],
["MiDaS Normal Map", "normal_map", "control_normal"],
["BAE Normal Map", "normal_bae", "control_v11p_sd15_normalbae"],
["MeshGraphormer Hand Refiner", "depth_hand_refiner", "control_sd15_inpaint_depth_hand_fp16"],
["Depth Anything", "depth_anything", "Depth-Anything"],
["Zoe Depth Anything", "depth_anything", "Depth-Anything"],
["Normal DSINE", "", "control_normal/control_v11p_sd15_normalbae"],
["Metric3D Depth", "", "control_v11f1p_sd15_depth
control_depth
t2iadapter_depth"],
["Metric3D Normal", "", "control_v11p_sd15_normalbae"],
["Depth Anything V2", "", "Depth-Anything"]
]
}
},
{
"title": "面部和姿势估计器",
"table": {
"headers": ["预处理器节点", "sd-webui-controlnet/other", "ControlNet/T2I-Adapter"],
"rows": [
["DWPose Estimator", "dw_openpose_full", "control_v11p_sd15_openpose
control_openpose
t2iadapter_openpose"],
["OpenPose Estimator", "openpose (detect_body)
openpose_hand (detect_body + detect_hand)
openpose_faceonly (detect_face)
openpose_full (detect_hand + detect_body + detect_face)", "control_v11p_sd15_openpose
control_openpose
t2iadapter_openpose"],
["MediaPipe Face Mesh", "mediapipe_face", "controlnet_sd21_laion_face_v2"],
["Animal Estimator", "animal_openpose", "control_sd15_animal_openpose_fp16"]
]
}
},
{
"title": "光流估计器",
"table": {
"headers": ["预处理器节点", "sd-webui-controlnet/other", "ControlNet/T2I-Adapter"],
"rows": [
["Unimatch Optical Flow", "", "DragNUWA"]
]
}
},
{
"title": "语义分割",
"table": {
"headers": ["预处理器节点", "sd-webui-controlnet/other", "ControlNet/T2I-Adapter"],
"rows": [
["OneFormer ADE20K Segmentor", "oneformer_ade20k", "control_v11p_sd15_seg"],
["OneFormer COCO Segmentor", "oneformer_coco", "control_v11p_sd15_seg"],
["UniFormer Segmentor", "segmentation", "control_sd15_seg
control_v11p_sd15_seg"]
]
}
},
{
"title": "仅T2IAdapter",
"table": {
"headers": ["预处理器节点", "sd-webui-controlnet/other", "ControlNet/T2I-Adapter"],
"rows": [
["Color Pallete", "color", "t2iadapter_color"],
["Content Shuffle", "shuffle", "t2iadapter_style"]
]
}
},
{
"title": "重新着色",
"table": {
"headers": ["预处理器节点", "sd-webui-controlnet/other", "ControlNet/T2I-Adapter"],
"rows": [
["Image Luminance", "recolor_luminance", "ioclab_sd15_recolor
sai_xl_recolor_256lora
bdsqlsz_controlllite_xl_recolor_luminance"],
["Image Intensity", "recolor_intensity", "可能与上面相同"]
]
}
}
],
"openposeInfo": {
"title": "如何获取OpenPose格式JSON?",
"userSection": {
"title": "用户端",
"description": "此工作流将图像保存到ComfyUI的输出文件夹(与输出图像相同的位置)。如果您没有找到"Save Pose Keypoints"节点,请更新此扩展",
"image": "https://raw.githubusercontent.com/Fannovel16/comfyui_controlnet_aux/main/examples/example_save_kps.png",
"alt": "Save Pose Keypoints Example"
},
"developerSection": {
"title": "开发者端",
"description": "对应于IMAGE批次中每一帧的OpenPose格式JSON数组可以从DWPose和OpenPose使用UI上的app.nodeOutputs或/history API端点获取。AnimalPose的JSON输出使用与OpenPose JSON类似的格式:",
"jsonExample": {
"code": "[\n {\n "version": "ap10k",\n "animals": [\n [[x1, y1, 1], [x2, y2, 1],..., [x17, y17, 1]],\n [[x1, y1, 1], [x2, y2, 1],..., [x17, y17, 1]],\n ...\n ],\n "canvas_height": 512,\n "canvas_width": 768\n },\n ...\n]"
},
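"parseExample": {
"description": "A minimal Python sketch for walking that structure (openpose_json_str is a hypothetical variable holding one openpose_json output; each animal is a list of 17 [x, y, confidence] keypoints):",
"language": "python",
"code": "import json\n\nframes = json.loads(openpose_json_str)  # one entry per frame in the IMAGE batch\nfor frame in frames:\n    width, height = frame[\"canvas_width\"], frame[\"canvas_height\"]\n    for animal in frame[\"animals\"]:\n        for x, y, confidence in animal:\n            print(x, y, confidence)"
},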
"codeExamples": [
{
"title": "对于扩展开发者(例如Openpose编辑器):",
"language": "javascript",
"code": "const poseNodes = app.graph._nodes.filter(node => ["OpenposePreprocessor", "DWPreprocessor", "AnimalPosePreprocessor"].includes(node.type))\nfor (const poseNode of poseNodes) {\n const openposeResults = JSON.parse(app.nodeOutputs[poseNode.id].openpose_json[0])\n console.log(openposeResults) //包含每帧Openpose JSON的数组\n}"
},
{
"title": "对于API用户:",
"subtitle": "Javascript",
"language": "javascript",
"code": "import fetch from "node-fetch" //记得在"package.json"中添加"type": "module"\nasync function main() {\n const promptId = '792c1905-ecfe-41f4-8114-83e6a4a09a9f' //太懒了,不想POST /queue\n let history = await fetch(http://127.0.0.1:8188/history/${promptId}
).then(re => re.json())\n history = history[promptId]\n const nodeOutputs = Object.values(history.outputs).filter(output => output.openpose_json)\n for (const nodeOutput of nodeOutputs) {\n const openposeResults = JSON.parse(nodeOutput.openpose_json[0])\n console.log(openposeResults) //包含每帧Openpose JSON的数组\n }\n}\nmain()"
},
{
"subtitle": "Python",
"language": "python",
"code": "import json, urllib.request\n\nserver_address = "127.0.0.1:8188"\nprompt_id = '' #太懒了,不想POST /queue\n\ndef get_history(prompt_id):\n with urllib.request.urlopen("http://{}/history/{}".format(server_address, prompt_id)) as response:\n return json.loads(response.read())\n\nhistory = get_history(prompt_id)[prompt_id]\nfor o in history['outputs']:\n for node_id in history['outputs']:\n node_output = history['outputs'][node_id]\n if 'openpose_json' in node_output:\n print(json.loads(node_output['openpose_json'][0])) #包含每帧Openpose JSON的列表"
}
]
}
}
},
"examples": {
"id": "示例",
"title": "ControlNet Aux 示例",
"subtitle": "ControlNet Aux 一图胜千言",
"images": [
{
"src": "https://raw.githubusercontent.com/Fannovel16/comfyui_controlnet_aux/main/examples/ExecuteAll1.jpg",
"alt": "Example 1"
},
{
"src": "https://raw.githubusercontent.com/Fannovel16/comfyui_controlnet_aux/main/examples/ExecuteAll2.jpg",
"alt": "Example 2"
}
],
"links": {
"workflow": {
"text": "测试工作流",
"url": "https://github.com/Fannovel16/comfyui_controlnet_aux/blob/main/examples/ExecuteAll.png"
},
"inputImage": {
"text": "输入图像",
"url": "https://github.com/Fannovel16/comfyui_controlnet_aux/blob/main/examples/comfyui-controlnet-aux-logo.png"
}
}
},
"faq": {
"id": "问答",
"title": "ControlNet Aux 问答",
"items": [
{
"question": "为什么安装ControlNet Aux仓库后有些节点没有出现?",
"answer": "此ControlNet Aux仓库有一个新机制,会跳过任何无法导入的自定义节点。如果您遇到这种情况,请在Issues选项卡上创建一个问题,并附上命令行中的日志。",
"links": [
{
"text": "Issues选项卡",
"url": "https://github.com/Fannovel16/comfyui_controlnet_aux/issues"
}
]
},
{
"question": "DWPose/AnimalPose只使用CPU所以很慢。如何让它使用GPU?",
"answer": "有两种方法可以加速DWPose:使用TorchScript检查点(.torchscript.pt)或ONNXRuntime(.onnx)。TorchScript方式比ONNXRuntime稍慢,但不需要任何额外的库,仍然比CPU快得多。",
"note": "TorchScript边界框检测器与onnx姿势估计器兼容,反之亦然。",
"subsections": [
{
"title": "TorchScript",
"content": "根据此图设置bbox_detector和pose_estimator。如果输入图像理想,您可以尝试其他以.torchscript.pt结尾的边界框检测器来减少边界框检测时间。",
"image": {
"src": "https://raw.githubusercontent.com/Fannovel16/comfyui_controlnet_aux/main/examples/example_torchscript.png",
"alt": "TorchScript Example"
}
},
{
"title": "ONNXRuntime",
"content": "如果成功安装了onnxruntime并且检查点使用以.onnx结尾,它将替换默认的cv2后端以利用GPU。请注意,如果您使用的是NVidia卡,此方法目前只能在CUDA 11.8(ComfyUI_windows_portable_nvidia_cu118_or_cpu.7z)上工作,除非您自己编译onnxruntime。",
"steps": [
{
"title": "了解您的onnxruntime构建:",
"items": [
"NVidia CUDA 11.x或以下/AMD GPU:onnxruntime-gpu",
"NVidia CUDA 12.x:onnxruntime-gpu --extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/onnxruntime-cuda-12/pypi/simple/",
"DirectML:onnxruntime-directml",
"OpenVINO:onnxruntime-openvino"
],
"note": "请注意,如果这是您第一次使用ComfyUI,请在执行下一步之前测试它是否可以在您的设备上运行。"
},
"将其添加到requirements.txt",
"运行install.bat或安装部分提到的pip命令"
],
"image": {
"src": "https://raw.githubusercontent.com/Fannovel16/comfyui_controlnet_aux/main/examples/example_onnx.png",
"alt": "ONNX Example"
}
}
]
}
]
},
"resources": {
"id": "ControlNet Aux预处理器资源文件",
"title": "ControlNet Aux预处理器资源文件",
"items": [
{
"name": "anime_face_segment",
"links": [
"https://huggingface.co/bdsqlsz/qinglong_controlnet-lllite/blob/main/Annotators/UNet.pth",
"https://huggingface.co/skytnt/anime-seg/blob/main/isnetis.ckpt"
]
},
{
"name": "densepose",
"links": [
"https://huggingface.co/LayerNorm/DensePose-TorchScript-with-hint-image/blob/main/densepose_r50_fpn_dl.torchscript"
]
},
{
"name": "dwpose",
"subsections": [
{
"name": "bbox_detector",
"description": "可以是",
"links": [
"https://huggingface.co/yzd-v/DWPose/blob/main/yolox_l.onnx",
"https://huggingface.co/hr16/yolox-onnx/blob/main/yolox_l.torchscript.pt",
"https://huggingface.co/hr16/yolo-nas-fp16/blob/main/yolo_nas_l_fp16.onnx",
"https://huggingface.co/hr16/yolo-nas-fp16/blob/main/yolo_nas_m_fp16.onnx",
"https://huggingface.co/hr16/yolo-nas-fp16/blob/main/yolo_nas_s_fp16.onnx"
]
},
{
"name": "pose_estimator",
"description": "可以是",
"links": [
"https://huggingface.co/hr16/DWPose-TorchScript-BatchSize5/blob/main/dw-ll_ucoco_384_bs5.torchscript.pt",
"https://huggingface.co/yzd-v/DWPose/blob/main/dw-ll_ucoco_384.onnx"
]
}
]
},
{
"name": "animal_pose (ap10k)",
"subsections": [
{
"name": "bbox_detector",
"description": "可以是",
"links": [
"https://huggingface.co/yzd-v/DWPose/blob/main/yolox_l.onnx",
"https://huggingface.co/hr16/yolox-onnx/blob/main/yolox_l.torchscript.pt",
"https://huggingface.co/hr16/yolo-nas-fp16/blob/main/yolo_nas_l_fp16.onnx",
"https://huggingface.co/hr16/yolo-nas-fp16/blob/main/yolo_nas_m_fp16.onnx",
"https://huggingface.co/hr16/yolo-nas-fp16/blob/main/yolo_nas_s_fp16.onnx"
]
},
{
"name": "pose_estimator",
"description": "可以是",
"links": [
"https://huggingface.co/hr16/DWPose-TorchScript-BatchSize5/blob/main/rtmpose-m_ap10k_256_bs5.torchscript.pt",
"https://huggingface.co/hr16/UnJIT-DWPose/blob/main/rtmpose-m_ap10k_256.onnx"
]
}
]
},
{
"name": "face_yolox",
"links": [
"https://huggingface.co/hr16/yolo-nas-fp16/blob/main/yolo_nas_l_fp16.onnx",
"https://huggingface.co/hr16/yolo-nas-fp16/blob/main/yolo_nas_m_fp16.onnx",
"https://huggingface.co/hr16/yolo-nas-fp16/blob/main/yolo_nas_s_fp16.onnx"
]
},
{
"name": "hand_yolox",
"links": [
"https://huggingface.co/hr16/yolo-nas-fp16/blob/main/yolo_nas_l_fp16.onnx",
"https://huggingface.co/hr16/yolo-nas-fp16/blob/main/yolo_nas_m_fp16.onnx",
"https://huggingface.co/hr16/yolo-nas-fp16/blob/main/yolo_nas_s_fp16.onnx"
]
},
{
"name": "hed",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/ControlNetHED.pth"
]
},
{
"name": "leres",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/res101.pth",
"https://huggingface.co/lllyasviel/Annotators/blob/main/latest_net_G.pth"
]
},
{
"name": "lineart",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/sk_model.pth",
"https://huggingface.co/lllyasviel/Annotators/blob/main/sk_model2.pth"
]
},
{
"name": "lineart_anime",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/netG.pth"
]
},
{
"name": "manga_line",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/erika.pth"
]
},
{
"name": "midas",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/dpt_hybrid-midas-501f0c75.pt"
]
},
{
"name": "mlsd",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/mlsd_large_512_fp32.pth"
]
},
{
"name": "normal_bae",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/scannet.pt"
]
},
{
"name": "oneformer",
"subsections": [
{
"name": "coco",
"description": "",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/250_16_swin_l_oneformer_coco_100ep.pth"
]
},
{
"name": "ade20k",
"description": "",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/150_16_swin_l_oneformer_ade20k_160k.pth"
]
}
]
},
{
"name": "openpose",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/body_pose_model.pth",
"https://huggingface.co/lllyasviel/Annotators/blob/main/hand_pose_model.pth",
"https://huggingface.co/lllyasviel/Annotators/blob/main/facenet.pth"
]
},
{
"name": "pidinet",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/table5_pidinet.pth"
]
},
{
"name": "uniformer",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/upernet_global_small.pth"
]
},
{
"name": "zoe",
"links": [
"https://huggingface.co/lllyasviel/Annotators/blob/main/ZoeD_M12_N.pt"
]
}
],
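"downloadExample": {
"description": "These checkpoints are normally fetched automatically the first time a node runs. If you prefer to pre-download one manually, here is a minimal Python sketch using huggingface_hub; the repo_id/filename pair is taken from the hed entry above:",
"language": "python",
"code": "from huggingface_hub import hf_hub_download\n\n# Download one annotator checkpoint from the 🤗 Hub and print its local path\npath = hf_hub_download(\n    repo_id=\"lllyasviel/Annotators\",\n    filename=\"ControlNetHED.pth\",\n)\nprint(path)"
},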
"note": {
"text": "更多资源文件请参考",
"link": {
"text": "GitHub仓库",
"url": "https://github.com/Fannovel16/comfyui_controlnet_aux"
}
}
}
},
"styles": {
"theme": {
"primaryColor": "#5E6AD2",
"secondaryColor": "#E2E5F4",
"backgroundLight": "#ffffff",
"textLight": "#1f2937",
"backgroundDark": "#111827",
"textDark": "#f3f4f6"
}
}
}