Fur Rendering with Bidirectional Texture Function

Advisor: 張鈞法

Abstract


In this thesis, we attempt to render fur in real time using the bidirectional texture function (BTF). Our pipeline has two main parts: acquisition of the BTF and rendering with the BTF. For acquisition, rather than capturing the material of real fur with traditional measurement methods, we implement a simulation-based approach that produces realistic-looking fur. By varying the lighting and viewing directions, we capture a number of images of a synthetically generated fur sample; these images are the BTF data we need. For rendering, we use an efficient BTF rendering algorithm that works on arbitrary surfaces. The images obtained in the first step are rearranged and then simplified, so that the data become small enough to be loaded entirely into the graphics hardware, which computes the final image. Because our method combines two rather different techniques, its limitation lies in the data passed between them: if there are too many long hairs, hairs exceeding a certain length are cut off. In terms of speed, the pre-processing stage takes a long time, but the way the fur grows can be fully controlled, and rendering with the BTF is fast enough that the effect of changing the lighting can be seen in real time.
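The acquisition step described above can be pictured as looping over sampled viewing and lighting directions, rendering the synthetic fur sample once per pair, and stacking the resulting images into a table indexed by (view, light, pixel). The following Python sketch illustrates this idea only; the sampling counts, image resolution, and the render_fur_patch() helper are hypothetical placeholders, not values or functions taken from the thesis.

import numpy as np

PATCH_RES = 64   # spatial resolution of each captured fur image (assumed)
N_VIEW = 8       # number of sampled viewing directions (assumed)
N_LIGHT = 8      # number of sampled lighting directions (assumed)

def hemisphere_directions(n):
    """Return n roughly uniform directions on the upper hemisphere."""
    dirs = []
    for i in range(n):
        z = (i + 0.5) / n                      # elevation term
        r = np.sqrt(max(0.0, 1.0 - z * z))
        phi = 2.0 * np.pi * i * 0.618034       # golden-ratio rotation in azimuth
        dirs.append((r * np.cos(phi), r * np.sin(phi), z))
    return np.array(dirs)

def render_fur_patch(view_dir, light_dir, res=PATCH_RES):
    """Hypothetical stand-in for the off-line fur renderer that would produce
    a realistic-looking image of the fur sample for one view/light pair."""
    # Placeholder shading so the script runs end to end without a real renderer.
    shade = max(light_dir[2], 0.0)
    return np.full((res, res, 3), shade, dtype=np.float32)

def acquire_btf():
    """Capture one image per (view, light) pair; the stack is the BTF database."""
    views = hemisphere_directions(N_VIEW)
    lights = hemisphere_directions(N_LIGHT)
    btf = np.zeros((N_VIEW, N_LIGHT, PATCH_RES, PATCH_RES, 3), np.float32)
    for vi, v in enumerate(views):
        for li, l in enumerate(lights):
            btf[vi, li] = render_fur_patch(v, l)
    return btf

if __name__ == "__main__":
    data = acquire_btf()
    print("BTF database shape:", data.shape)   # (views, lights, y, x, rgb)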

Keywords

fur, bidirectional texture function, texture

Parallel Abstract


We make an attempt at real-time rendering of fur using the Bidirectional Texture Function (BTF). The process is divided into two main parts: acquisition of the BTF and rendering with it. For the first part, instead of capturing real fur, we implement a basic fur rendering process to generate realistic-looking fur. By controlling the viewing and lighting parameters for the model of the fur sample, we capture several images; these images serve as our BTF database. For the second part, we implement an efficient BTF rendering algorithm that can be applied to arbitrary surfaces. The images from the first step are rearranged and simplified until the database is small enough to be loaded into graphics hardware for rendering. Our work combines two different kinds of techniques, and its limitation lies in the bridge connecting them: the captured images of the fur sample. If the fur is long, too many of its characteristics are cut off. As for speed, the pre-processing time is long, but the growth of the fur is well controlled. Rendering with the BTF is fast, and instant relighting of the fur is possible.
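The "rearrange and simplify" step can be read as reorganising the captured images so that each texel gets its own small view-by-light table, then compressing those tables so they fit in graphics memory. A PCA/SVD factorisation is one common way to compress BTF data; the sketch below uses it purely as an illustrative assumption, not as the exact scheme of the thesis, and the array shapes and function names are hypothetical.

import numpy as np

def compress_btf(btf, n_components=8):
    """btf: (n_view, n_light, h, w, 3) array from the acquisition step.
    Returns per-texel weights and a small shared basis that approximate it."""
    nv, nl, h, w, c = btf.shape
    # Rearrange: one row per texel, one column per (view, light, channel) sample.
    matrix = btf.transpose(2, 3, 0, 1, 4).reshape(h * w, nv * nl * c)
    mean = matrix.mean(axis=0, keepdims=True)
    u, s, vt = np.linalg.svd(matrix - mean, full_matrices=False)
    weights = u[:, :n_components] * s[:n_components]   # per-texel, small
    basis = vt[:n_components]                           # shared across texels
    return weights, basis, mean

def reconstruct_texel(weights, basis, mean, texel_index):
    """The per-fragment work at render time: a short dot product."""
    return weights[texel_index] @ basis + mean[0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_btf = rng.random((8, 8, 16, 16, 3)).astype(np.float32)  # dummy data
    w8, b8, mu = compress_btf(fake_btf, n_components=8)
    approx = reconstruct_texel(w8, b8, mu, texel_index=0)
    print("compressed sizes:", w8.shape, b8.shape)

Under this reading, a fragment shader would fetch the per-texel weights from one texture and the shared basis from another and reconstruct the colour with a short dot product, which is why the compressed form can be relit in real time.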

Parallel Keywords

fur, fur rendering, BTF, bidirectional texture function, texture

