Title: Sketch2Shape: AI-powered Local 3D Shape Editing with Sketch-based Guidance
Abstract:
3D modeling often involves intricate and time-consuming processes to achieve desired shapes and textures. We introduce Sketch2Shape, a novel approach that leverages the intuitiveness of sketching for efficient and precise local editing of 3D shapes. Our method employs a deep learning model trained to interpret user sketches in the context of a 3D model and execute corresponding modifications with accuracy and finesse. The model analyzes the sketch input, recognizes the intended modification type and location, and seamlessly integrates the changes into the 3D model while preserving its overall structure and coherence. Sketch2Shape supports various local editing operations, including texture modification, shape deformation, and part addition. We demonstrate the effectiveness of our approach through extensive experiments and user studies, showcasing its potential to streamline 3D modeling workflows and enhance creative exploration.
Keywords:
3D shape editing, sketch-based interface, human-computer interaction, deep learning, computer graphics, generative modeling
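To make the editing pipeline described in the abstract more concrete, the following is a minimal sketch of an inference step, assuming a raster sketch input and a point-cloud representation of the 3D shape. All module names, tensor shapes, and the three edit categories are illustrative assumptions, not the actual Sketch2Shape architecture.

```python
# Hypothetical sketch of the Sketch2Shape inference flow: encode the user
# sketch, encode the 3D shape, then predict the edit type and the local
# region it applies to. Shapes and heads are assumptions for illustration.
import torch
import torch.nn as nn

class SketchEditPredictor(nn.Module):
    """Maps a rasterized sketch plus a 3D point cloud to an edit proposal."""
    def __init__(self, num_edit_types: int = 3, feat_dim: int = 64):
        super().__init__()
        # Encode the 2D sketch (single-channel raster image).
        self.sketch_encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Encode the 3D shape as a point cloud (shared MLP over points).
        self.shape_encoder = nn.Sequential(
            nn.Linear(3, feat_dim), nn.ReLU(), nn.Linear(feat_dim, feat_dim),
        )
        # Heads: which kind of edit (e.g. texture / deformation / part
        # addition) and where on the shape it applies (per-point mask).
        self.edit_type_head = nn.Linear(2 * feat_dim, num_edit_types)
        self.region_head = nn.Linear(2 * feat_dim, 1)

    def forward(self, sketch: torch.Tensor, points: torch.Tensor):
        # sketch: (B, 1, H, W), points: (B, N, 3)
        s = self.sketch_encoder(sketch)                       # (B, F)
        p = self.shape_encoder(points)                        # (B, N, F)
        s_exp = s.unsqueeze(1).expand(-1, p.size(1), -1)      # (B, N, F)
        fused = torch.cat([s_exp, p], dim=-1)                 # (B, N, 2F)
        edit_type = self.edit_type_head(fused.mean(dim=1))    # (B, types)
        region_mask = torch.sigmoid(self.region_head(fused))  # (B, N, 1)
        return edit_type, region_mask

# Example: one 128x128 sketch and one 2048-point shape.
model = SketchEditPredictor()
edit_logits, mask = model(torch.randn(1, 1, 128, 128), torch.randn(1, 2048, 3))
```

In this sketch, the predicted edit type selects the operation (texture modification, shape deformation, or part addition) and the per-point mask localizes it, so the rest of the model is left untouched, consistent with the structure-preserving local editing the abstract describes.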
Table of Contents
- 1. Introduction
  - 1.1 Motivation and Problem Statement
    Briefly discuss the limitations of current 3D modeling tools and the need for more intuitive and efficient editing techniques.
  - 1.2 Proposed Approach
