Paper Title
CLIP-based Neural Neighbor Style Transfer for 3D Assets
Paper Authors
Paper Abstract
We present a method for transferring the style from a set of images to a 3D object. The texture appearance of an asset is optimized with a differentiable renderer in a pipeline whose losses are computed with pretrained deep neural networks. More specifically, we utilize a nearest-neighbor feature matching loss with CLIP-ResNet50 to extract the style from images. We show that a CLIP-based style loss yields a different appearance than a VGG-based loss by focusing more on texture than on geometric shape. Additionally, we extend the loss to support multiple style images and enable loss-based control over the color palette, combined with automatic palette extraction from the style images.
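The core ingredient described in the abstract is a nearest-neighbor feature matching loss computed on CLIP-ResNet50 features of the differentiable render and of the style images. The sketch below illustrates one way such a loss can be written; it is not the authors' implementation, and the hooked layers (`layer2`, `layer3`), the cosine-distance matching, and the use of OpenAI's `clip` package are assumptions made for illustration.

```python
# Minimal sketch (not the authors' code) of a nearest-neighbor feature matching
# style loss on CLIP-ResNet50 features. Assumes OpenAI's `clip` package and
# PyTorch; the hooked layers and the cosine-distance formulation are
# illustrative choices.
import torch
import torch.nn.functional as F
import clip

device = "cuda" if torch.cuda.is_available() else "cpu"
model, _ = clip.load("RN50", device=device)
model = model.float().eval()                    # keep fp32 for stable gradients
for p in model.parameters():
    p.requires_grad_(False)                     # only the asset's texture is optimized

# Capture intermediate activations of the visual tower with forward hooks.
_acts = {}
def _hook(name):
    def fn(_module, _inp, out):
        _acts[name] = out
    return fn

for name in ("layer2", "layer3"):               # assumed layer choice
    getattr(model.visual, name).register_forward_hook(_hook(name))

def clip_features(img):
    """img: (B, 3, 224, 224), CLIP-normalized. Returns per-layer (B*H*W, C) features."""
    _acts.clear()
    model.encode_image(img)                     # hooks populate `_acts`
    return {k: v.flatten(2).permute(0, 2, 1).reshape(-1, v.shape[1])
            for k, v in _acts.items()}

def nn_style_loss(render, style_images):
    """Match each rendered-pixel feature to its cosine-nearest style feature.

    `style_images` may contain several exemplars; their features are simply
    concatenated, which is one way the multi-image extension can be realized.
    """
    f_render = clip_features(render)
    with torch.no_grad():
        f_style = clip_features(style_images)
    loss = render.new_zeros(())
    for k in f_render:
        r = F.normalize(f_render[k], dim=1)              # (Nr, C) render features
        s = F.normalize(f_style[k], dim=1)               # (Ns, C) style features
        nearest = (r.detach() @ s.t()).argmax(dim=1)     # index of nearest style feature
        loss = loss + (1.0 - (r * s[nearest]).sum(dim=1)).mean()
    return loss
```

In a differentiable-rendering loop, `render` would be the current view of the textured asset (with gradients enabled) and the loss gradient would be backpropagated into the texture parameters; the CLIP weights themselves stay frozen.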