
Make-it-Real

Make-it-Real: Unleashing Large Multimodal Model’s Ability for Painting 3D Objects with Realistic Materials

Ye Fang*, Zeyi Sun*, Tong Wu, Jiaqi Wang, Ziwei Liu, Gordon Wetzstein, Dahua Lin

*Equal Contribution

Demo

📜 News

🚀 [2024/4/26] The paper and project page are released!

💡 Highlights

  • 🔥 We first demonstrate that GPT-4V can effectively recognize and describe materials, allowing our model to precisely identify and align materials with the corresponding components of 3D objects.
  • 🔥 We construct a Material Library containing thousands of materials with highly detailed descriptions, ready for MLLMs to look up and assign.
  • 🔥 We design an effective pipeline for texture segmentation, material identification, and matching, enabling high-quality application of materials to 3D assets (see the sketch after this list).
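
To make the highlighted pipeline concrete, here is a minimal, hypothetical Python sketch of the flow: segment the object's texture into parts, have an MLLM describe each part's material, and match the description against the annotated Material Library. The names `MaterialEntry`, `describe_region_with_mllm`, and `match_material` are illustrative assumptions, not the repository's actual API.

```python
# Hypothetical sketch of the Make-it-Real flow; not the repo's actual code.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class MaterialEntry:
    name: str
    description: str          # detailed text annotation (e.g. generated by GPT-4V)
    pbr_maps: Dict[str, str]  # e.g. {"albedo": ..., "roughness": ..., "metallic": ...}


def describe_region_with_mllm(region_image_path: str) -> str:
    """Placeholder for an MLLM (e.g. GPT-4V) query that returns a material
    description for one segmented texture region."""
    raise NotImplementedError("Plug in your multimodal model query here.")


def match_material(description: str, library: List[MaterialEntry]) -> MaterialEntry:
    """Naive matching: pick the library entry whose annotation shares the most
    words with the description. The real system lets the MLLM look up the
    annotated library directly."""
    tokens = set(description.lower().split())
    return max(library, key=lambda m: len(tokens & set(m.description.lower().split())))


def paint_object(region_images: Dict[str, str],
                 library: List[MaterialEntry]) -> Dict[str, MaterialEntry]:
    """Assign a library material (and its PBR maps) to each segmented part."""
    assignments = {}
    for part_name, image_path in region_images.items():
        desc = describe_region_with_mllm(image_path)
        assignments[part_name] = match_material(desc, library)
    return assignments
```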

👨‍💻 Todo

  • Evaluation for Model-Generated Assets
  • Make-it-Real Pipeline Inference Code
  • Highly detailed Material Library annotations (generated by GPT-4V)
  • Paper and Web Demos

⚡ Quick Start

✒️ Citation

If you find our work helpful for your research, please consider giving a star ⭐ and a citation 📝:

```bibtex
@misc{fang2024makeitreal,
      title={Make-it-Real: Unleashing Large Multimodal Model's Ability for Painting 3D Objects with Realistic Materials}, 
      author={Ye Fang and Zeyi Sun and Tong Wu and Jiaqi Wang and Ziwei Liu and Gordon Wetzstein and Dahua Lin},
      year={2024},
      eprint={2404.16829},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```