---
license: apache-2.0
---

# EliGen: Entity-Level Controlled Image Generation

## Introduction

We propose EliGen, a novel approach that leverages fine-grained entity-level information to enable precise and controllable text-to-image generation. EliGen excels at tasks such as entity-level controlled image generation and image inpainting, and its applicability is not limited to these areas. It can also be seamlessly integrated with existing community models, such as IP-Adapter and In-Context LoRA.

* Paper: [EliGen: Entity-Level Controlled Image Generation with Regional Attention](https://arxiv.org/abs/2501.01097)
* GitHub: [DiffSynth-Studio](https://github.com/modelscope/DiffSynth-Studio)
* Model: [ModelScope](https://www.modelscope.cn/models/DiffSynth-Studio/Eligen)
* Online Demo: [ModelScope EliGen Studio](https://www.modelscope.cn/studios/DiffSynth-Studio/EliGen)
* Training dataset: [ModelScope Dataset](https://www.modelscope.cn/datasets/DiffSynth-Studio/EliGenTrainSet)

## Methodology

We introduce a regional attention mechanism within the DiT framework to process the conditions of each entity effectively. This mechanism lets the local prompt associated with each entity semantically influence its specific region through regional attention. To further enhance EliGen's layout control capabilities, we contribute a carefully curated entity-annotated dataset and fine-tune the model using the LoRA framework.

1. **Regional Attention**: The regional attention mechanism, illustrated in the figure above, can be readily applied to other text-to-image models. Its core principle is to transform the positional information of each entity into an attention mask, ensuring that each local prompt only affects its designated region.
2. **Dataset with Entity Annotation**: To construct a dedicated entity-control dataset, we start by randomly selecting captions from DiffusionDB and generating the corresponding source images with Flux. Next, we employ Qwen2-VL 72B, recognized for its advanced grounding capabilities among MLLMs, to randomly identify entities within each image. These entities are annotated with local prompts and bounding boxes for precise localization, forming the foundation of our dataset for further training.
3. **Training**: We use LoRA (Low-Rank Adaptation) and DeepSpeed to fine-tune the regional attention mechanism on the curated dataset, enabling EliGen to achieve effective entity-level control.

## Usage

This model was trained using [DiffSynth-Studio](https://github.com/modelscope/DiffSynth-Studio), and we recommend using DiffSynth-Studio for generation.

```shell
git clone https://github.com/modelscope/DiffSynth-Studio.git
cd DiffSynth-Studio
pip install -e .
```
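After installation, entity-level controlled generation follows the pattern used throughout the DiffSynth-Studio example scripts: load the FLUX base model, attach the EliGen LoRA, and pass one local prompt plus one mask per entity. The snippet below is a minimal sketch of that flow; the model paths and the `entity_prompts` / `entity_masks` argument names are assumptions for illustration, so treat [entity_control.py](https://github.com/modelscope/DiffSynth-Studio/tree/main/examples/EntityControl/entity_control.py) as the authoritative reference.

```python
import torch
from PIL import Image, ImageDraw
from diffsynth import ModelManager, FluxImagePipeline

# Load the FLUX.1-dev base model and the EliGen LoRA (paths are illustrative).
model_manager = ModelManager(torch_dtype=torch.bfloat16, device="cuda")
model_manager.load_models([
    "models/FLUX/FLUX.1-dev/flux1-dev.safetensors",
    "models/FLUX/FLUX.1-dev/text_encoder/model.safetensors",
    "models/FLUX/FLUX.1-dev/text_encoder_2",
    "models/FLUX/FLUX.1-dev/ae.safetensors",
])
model_manager.load_lora("models/lora/entity_control/model_bf16.safetensors", lora_alpha=1.0)
pipe = FluxImagePipeline.from_model_manager(model_manager)

# One binary mask per entity: white marks the region its local prompt controls.
def box_mask(box, size=(1024, 1024)):
    mask = Image.new("L", size, 0)
    ImageDraw.Draw(mask).rectangle(box, fill=255)
    return mask

entity_prompts = ["a red vintage car", "a golden retriever"]
entity_masks = [box_mask((64, 512, 576, 960)), box_mask((640, 448, 992, 960))]

# The entity argument names below are assumed; see entity_control.py for the exact call.
image = pipe(
    prompt="a sunny street scene, photorealistic",
    height=1024, width=1024, seed=0,
    entity_prompts=entity_prompts,
    entity_masks=entity_masks,
)
image.save("eligen_example.png")
```

The example scripts below cover each supported workflow in full.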
1. **Entity-Level Controlled Image Generation**

    EliGen achieves effective entity-level control results. See [entity_control.py](https://github.com/modelscope/DiffSynth-Studio/tree/main/examples/EntityControl/entity_control.py) for usage.

2. **Image Inpainting**

    To apply EliGen to image inpainting tasks, we propose an inpainting fusion pipeline that preserves non-inpainted areas while enabling precise, entity-level modifications within inpainted regions. See [entity_inpaint.py](https://github.com/modelscope/DiffSynth-Studio/tree/main/examples/EntityControl/entity_inpaint.py) for usage.

3. **Styled Entity Control**

    EliGen can be seamlessly integrated with existing community models. We provide an example of integrating it with IP-Adapter. See [entity_control_ipadapter.py](https://github.com/modelscope/DiffSynth-Studio/tree/main/examples/EntityControl/entity_control_ipadapter.py) for usage.

4. **Entity Transfer**

    We provide an example of integrating EliGen with In-Context LoRA, achieving interesting entity transfer results. See [entity_transfer.py](https://github.com/modelscope/DiffSynth-Studio/tree/main/examples/EntityControl/entity_transfer.py) for usage.

5. **Play with EliGen using the UI**

    Download the EliGen checkpoint from [ModelScope](https://www.modelscope.cn/models/DiffSynth-Studio/Eligen) to `models/lora/entity_control` and run the following command to launch the interactive UI:

    ```bash
    python apps/gradio/entity_level_control.py
    ```

## Examples

### Entity-Level Controlled Image Generation

1. Generating images with continuously changing entity positions (a scripting sketch follows below).
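One way to script such a position sweep is to interpolate an entity's bounding box across a sequence of generations and rasterize each box into a mask, then pass each mask together with a fixed local prompt to the pipeline as in entity_control.py. The helper below is a hypothetical sketch of the mask preparation only; names such as `interpolate_box` and `box_to_mask` are illustrative and not part of DiffSynth-Studio.

```python
from PIL import Image, ImageDraw

def interpolate_box(start_box, end_box, t):
    """Linearly interpolate between two (x0, y0, x1, y1) boxes, with t in [0, 1]."""
    return tuple(round(s + (e - s) * t) for s, e in zip(start_box, end_box))

def box_to_mask(box, size=(1024, 1024)):
    """Rasterize a bounding box into a binary entity mask (white = controlled region)."""
    mask = Image.new("L", size, 0)
    ImageDraw.Draw(mask).rectangle(box, fill=255)
    return mask

# Sweep an entity from the left edge to the right edge over 8 generations.
start_box, end_box = (32, 384, 416, 896), (608, 384, 992, 896)
masks = [box_to_mask(interpolate_box(start_box, end_box, i / 7)) for i in range(8)]

# Each mask is then paired with the same local prompt (e.g. "a red vintage car")
# and used for one generation per position, keeping the global prompt and seed
# fixed so that only the entity's location changes across the sequence.
```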