Extending Blender: Development of a Haptic Authoring Tool

Abstract - In this paper, we present our work to extend a well-known 3D graphic modeler, Blender, to support haptic modeling and rendering. The extension tool is named HAMLAT (Haptic Application Markup Language Authoring Tool). We describe the modifications and additions to the Blender source code which have been used to create HAMLAT. Furthermore, we present and discuss the design decisions used when developing HAMLAT, as well as an implementation road map which describes the changes to the Blender source code. Finally, we conclude with a discussion of our future development and research avenues.

Keywords - Haptics, HAML, Graphic Modelers, Blender, Virtual Environments.

I. Introduction

A. Motivation

The increasing adoption of the haptic modality in human-computer interaction paradigms has led to a huge demand for new tools that help novice users author and edit haptic applications. Currently, haptic application development is a time-consuming process that requires programming expertise. The complexity of haptic application development arises from the fact that the haptic components (such as the haptic API, the device, and the haptic rendering algorithms) need to interact with the graphic components in order to achieve synchronicity. Additionally, there is a lack of application portability, since an application is tightly coupled to a specific device and must use that device's corresponding API. Device and API heterogeneity therefore lead to fragmentation and disorientation among both researchers and developers. In view of all these considerations, there is a clear need for an authoring tool that can build haptic applications while hiding programming details (such as the API, the device, or the virtual model) from the application modeler. This paper describes the technical development of the Haptic Application Markup Language Authoring Tool (HAMLAT).
It is intended to explain the design decisions used in developing HAMLAT, and it also provides an implementation road map describing the source code of the project.

B. Blender

HAMLAT is based on the Blender [1] software suite, an open-source 3D modeling package with a rich feature set. It has a sophisticated user interface noted for its efficiency and flexibility, as well as support for multiple file formats, a physics engine, modern computer graphics rendering, and many other features. Because of Blender's open architecture and supportive community base, it was selected as the platform of choice for the development of HAMLAT. The open-source nature of Blender means HAMLAT can easily leverage its existing functionality and focus on integrating the haptic features which make it a complete hapto-visual modeling tool; developing a 3D modeling platform from scratch would require considerable development time and expertise to reach Blender's level of functionality. We can also take advantage of future improvements to Blender by merging changes from its source code into the HAMLAT source tree. HAMLAT builds on existing Blender components, such as the user interface and editing tools, by adding new components which focus on the representation, modification, and rendering of the haptic properties of objects in a 3D scene. By using Blender as the basis for HAMLAT, we hope to develop a 3D haptic modeling tool which has the maturity and features of Blender combined with the novelty of haptic rendering.

C. Project Goals

As previously stated, the overall goal of the HAMLAT project is to produce a polished software application which combines the features of a modern graphic modeling tool with haptic rendering techniques. HAMLAT has the look and feel of a 3D graphical modeling package, but with the addition of features such as haptic rendering and haptic property descriptors.
This allows artists, modelers, and developers to generate realistic 3D hapto-visual virtual environments. A high-level block diagram of HAMLAT is shown in Figure 1. It illustrates the flow of data in haptic modeling. HAMLAT assists the modeler, or application developer, in building hapto-visual applications which may be stored in a database for later retrieval by another haptic application. By "hapto-visual application" we refer to any software which displays a 3D scene both visually and haptically to a user in a virtual setting. An XML file format, called HAML [2], is used to describe the 3D scenes and to store the hapto-visual environments built by a modeler for later playback to an end user.

Traditionally, building hapto-visual environments has required a strong technical and programming background. The task of haptically rendering a 3D scene is tedious, since haptic properties must be assigned to individual objects in the scene, and currently there are few high-level tools for accomplishing this task. HAMLAT bridges this gap by integrating into the HAML framework and delivering a complete solution for the development of hapto-visual applications which requires no programming knowledge.

The remainder of the paper is organized as follows: in Section 2, we present the proposed architecture extensions and discuss design constraints. Section 3 describes the implementation details and how haptic properties are added and rendered within the Blender framework. In Section 4, we discuss related issues and future work avenues.

II. System Overview and Architecture

The Blender design philosophy is based on three main tasks: data storage, editing, and visualization. According to the legacy documentation [3], it follows a data-visualize-edit development cycle for 3D modeling. A 3D scene is represented using data structures within the Blender architecture.
The modeler views the scene, makes changes using the editing interface which directly modifies the underlying data structures, and then the cycle repeats.

To better understand this development cycle, consider the representation of a 3D object in Blender. A 3D object may be represented by an array of vertices which have been organized as a polygonal mesh. Users may choose to operate on any subset of this data set. Editing tasks may include operations to rotate, scale, and translate the vertices, or perhaps a re-meshing algorithm to clean up redundant vertices and transform the mesh from a quad to a triangle topology. The data is visualized using a graphical 3D renderer which is capable of displaying the object as a wireframe or as a shaded, solid surface. The visualization is necessary in order to see the effects of editing on the data. In a nutshell, this example defines the design philosophy behind Blender's architecture.

In Blender, data is organized as a series of lists, and base data types are combined with links between items in each list, creating complex scenes from simple structures. This allows data elements in each list to be reused, thus reducing the overall storage requirements. For example, a mesh may be linked to by multiple scene objects: the position and orientation may change for each object while the topology of the mesh remains the same. A diagram illustrating the organization of data structures and the reuse of scene elements is shown in Figure 2. A scene links to three objects, each of which links to two polygonal meshes. The meshes also share a common material property. The entire scene is rendered on one of several screens, which visualizes the scene. We adopt the Blender design approach for our authoring tool.
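The list-and-link organization described above can be sketched in a few lines of Python. This is a minimal illustration of the data-reuse idea, not Blender's actual C structures; all class names here are hypothetical.

```python
# Sketch of Blender-style data reuse: scene objects link to shared mesh
# and material datablocks, so each datablock is stored only once.
# Class names are illustrative, not Blender's actual data structures.

class Material:
    def __init__(self, name):
        self.name = name

class Mesh:
    def __init__(self, vertices, material):
        self.vertices = vertices      # shared geometry/topology
        self.material = material      # meshes may share a material

class SceneObject:
    def __init__(self, mesh, position):
        self.mesh = mesh              # a link, not a copy
        self.position = position      # per-object transform

steel = Material("steel")
tri = Mesh(vertices=[(0, 0, 0), (1, 0, 0), (1, 1, 0)], material=steel)

# Two objects reuse the same mesh datablock at different positions.
a = SceneObject(tri, position=(0.0, 0.0, 0.0))
b = SceneObject(tri, position=(5.0, 0.0, 0.0))

assert a.mesh is b.mesh               # one copy of the geometry in memory
```

Editing the shared mesh through either object affects both, which is exactly the storage-saving behavior the list-and-link design provides.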
The data structures which are used to represent objects in a 3D scene have been augmented to include fields for haptic properties (e.g., stiffness, damping); user-interface components (e.g., button panels) which allow the modeler to change object properties have also been updated to include support for modifying the haptic properties of an object. Additionally, an interactive hapto-visual renderer has been implemented to display the 3D scene graphically and haptically, providing the modeler or artist with immediate feedback about the changes they make to the scene. A class diagram outlining the changes to the Blender framework is shown in Figure 3. Components which are pertinent to HAMLAT are shaded in gray. HAMLAT builds on existing Blender sub-systems by extending them for haptic modeling purposes.

Similarly to the built-in graphical renderer, HAMLAT uses a custom renderer for displaying 3D scenes graphically and haptically. This component is developed independently from the Blender renderer, since haptic and graphical rendering must be performed simultaneously and synchronously; a simulation loop is used to update the haptic rendering forces at a rate which maintains stability and quality. An in-depth discussion of the renderer is given in the next section.

III. Implementation

A. Data Structures

Blender uses many different data structures to represent the various types of objects in a 3D scene: a polygonal mesh contains the location and topology of vertices; a lamp contains colour and intensity values; and a camera object contains intrinsic viewing parameters.

The Mesh data structure is used by the Blender framework to describe polygonal mesh objects. It is of particular interest for haptic rendering: the tactile and kinesthetic cues, which are displayed due to interaction with virtual objects, are typically rendered based on the geometry of the mesh stored in this data type. Other scene components, such as lamps, cameras, or lines, are not intuitively rendered using force-feedback haptic devices and are therefore not of current interest for haptic rendering. An augmented version of the Mesh data structure is shown in Figure 4. It contains fields for vertex and face data, plus some special custom data fields which allow data to be stored to and retrieved from disk and memory. We have modified this data type to include a pointer to a haptics data structure (struct Haptics), which stores haptic properties such as stiffness, damping, and friction for the mesh elements (Figure 5).

Haptic Properties

In this section we briefly discuss the haptic properties which may currently be modeled using HAMLAT. It is important for the modeler to understand these properties and their basis for use in haptic rendering.

The stiffness of an object defines how resistant it is to deformation by some applied force. Hard objects, such as a rock or a table, have very high stiffness; soft objects, such as a rubber ball, have low stiffness. The hardness-softness of an object is typically rendered using the spring-force equation

    f = k * x,

where f is the force-feedback vector which is displayed to the user, k is the stiffness coefficient (variable name stiffness) for the object, and x is the penetration depth (displacement) of the haptic proxy into the object. The stiffness coefficient has a range of [0, 1], where 0 represents no resistance to deformation and 1 represents the maximum stiffness which may be rendered by the haptic device.

The damping of an object defines its resistance to the rate of deformation due to some applied force. It is typically rendered using the force equation

    f = b * (dx/dt),

where b is the damping coefficient (variable name damping) and dx/dt is the rate of deformation. The damping coefficient also has a range of [0, 1] and may be used to model the viscous behaviour of a material. It also increases the stability of the haptic rendering loop for stiff materials.

The static friction (variable name st_friction) and dynamic friction (variable name dy_friction) coefficients are used to model the frictional forces experienced while exploring the surface of a 3D object. Static friction is experienced when the proxy is not moving over the object's surface, and an initial force must be applied to overcome it. Dynamic friction is felt when the proxy moves across the surface, rubbing against it. Frictional coefficients also have a range of [0, 1], with a value of 0 making the surface of a 3D object feel slippery and a value of 1 making the object feel very rough.

B. Editing

Blender uses a set of non-overlapping windows called spaces to modify various aspects of the 3D scene and its objects. Each space is divided into a set of areas and panels which are context-aware; that is, they provide functionality based on the selected object type. For example, if a camera is selected, the panel displays components which allow the modeler to change the focal length and viewing angle of the camera, but these components do not appear if an object of another type is selected.

Figure 6 shows a screenshot of the button space which is used to edit the properties of a haptic mesh. It includes user-interface panels which allow a modeler to change the graphical shading properties of the mesh, perform simple re-meshing operations, and modify the haptic properties of the selected mesh. HAMLAT follows the context-sensitive behavior of Blender by only displaying the haptic editing panel when a polygonal mesh object is selected. In the future, this panel may be duplicated to support haptic modeling for other object types, such as NURBS surfaces. The Blender framework offers many user-interface components (e.g., buttons, sliders, pop-up menus) which may be used to edit the underlying data structures.
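The haptic material model described above (normalized stiffness, damping, and friction coefficients, with a spring-damper contact force) can be sketched as follows. This is a minimal illustration, not HAMLAT's implementation: the field names follow the text (stiffness, damping, st_friction, dy_friction), but the device-scale constants are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Haptics:
    """Per-mesh haptic properties; all coefficients normalized to [0, 1]."""
    stiffness: float = 0.5
    damping: float = 0.1
    st_friction: float = 0.0
    dy_friction: float = 0.0

# Device-specific scale factors mapping coefficient 1.0 to physical units.
# These values are assumptions for illustration, not from the paper.
MAX_STIFFNESS = 400.0   # N/m rendered at stiffness = 1.0
MAX_DAMPING = 2.0       # N*s/m rendered at damping = 1.0

def feedback_force(h, penetration_depth, penetration_velocity):
    """Spring-damper contact force along the contact normal:
    f = k*x + b*(dx/dt), zero when the proxy is outside the object."""
    if penetration_depth <= 0.0:
        return 0.0
    spring = h.stiffness * MAX_STIFFNESS * penetration_depth
    damper = h.damping * MAX_DAMPING * penetration_velocity
    return spring + damper

rubber = Haptics(stiffness=0.2, damping=0.4)
force = feedback_force(rubber, penetration_depth=0.01, penetration_velocity=0.05)
```

In a real haptic loop this computation would run at a high fixed rate (commonly around 1 kHz) to maintain the stability the text mentions; the damping term is what keeps stiff materials from oscillating.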
The haptic properties of mesh objects are editable using sliders, or by entering a floating-point value into a text box located adjacent to the slider. When the value of the slider/text box is changed, it triggers an event in the Blender windowing sub-system, where a unique identifier indicates which property of the underlying data structure to update.

C. Hapto-Visual Rendering

The hapto-visual renderer displays the scene in two passes: the first pass renders the scene graphically, and the second pass renders it haptically. The second pass is required because the OpenHaptics toolkit intercepts commands sent to the OpenGL pipeline and uses them to display the scene using haptic rendering techniques. In this pass, the haptic properties of each mesh object are used in much the same way colour and lighting are used by graphical rendering: they define the type of material for each object. To save CPU cycles, the lighting and graphical material properties are excluded from the haptic rendering pass. Figure 7 shows the source code which is used to apply the material properties during the haptic rendering pass. The haptic renderer is independent from the Blender framework in that it exists outside the original source code; however, it is still heavily dependent on Blender data structures and types.

D. Scripting

The Blender Python (BPy) wrapper exposes many of the internal data structures so that the internal Python scripting engine may access them. Similar to the data structures used for representing mesh objects in the native Blender framework, wrappers allow user-defined scripts to access and modify the elements in a 3D scene. An import script allows 3D scenes to be read from a HAML file and reproduced in the HAMLAT application; an export script allows 3D scenes to be written to a HAML file, including haptic properties, and used in other HAML applications. The BPy wrappers also expose the Blender windowing system. Figure 9 shows a panel which appears when the user exports a 3D scene to the HAML file format.
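The export path described above can be sketched as follows. The element names in this snippet are illustrative only; the actual HAML schema [2] defines its own tags, and a real script would read the scene through the BPy wrappers rather than from a plain dictionary.

```python
# Sketch of a HAMLAT-style export step: write per-object haptic
# properties to an XML file. Tag names are hypothetical, not the
# real HAML schema.
import xml.etree.ElementTree as ET

def export_haml(objects, path):
    """Write a dict of {object name: haptic property dict} as XML."""
    root = ET.Element("HAML")
    scene = ET.SubElement(root, "Scene")
    for name, props in objects.items():
        obj = ET.SubElement(scene, "Object", name=name)
        hap = ET.SubElement(obj, "Haptics")
        for key in ("stiffness", "damping", "st_friction", "dy_friction"):
            ET.SubElement(hap, key).text = str(props[key])
    ET.ElementTree(root).write(path)

export_haml({"Cube": {"stiffness": 0.8, "damping": 0.1,
                      "st_friction": 0.2, "dy_friction": 0.3}},
            "scene.haml")
```

A matching import script would simply parse the same file and write the recovered values back into the scene's mesh data structures, which is how round-tripping between HAMLAT and other HAML applications works.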
It allows the user to specify supplementary information about the application, such as a description, target hardware, and system requirements. These fields are defined by the HAML specification [2] and are included with the authored scene as part of the HAML file format. The user-interface components displayed on this panel are easily extended to accommodate future revisions of HAML.

IV. Conclusions and Future Work

Currently, HAMLAT supports basic functionality for modeling and rendering hapto-visual applications. Scenes may be created, edited, previewed, and exported as part of a database for use by other hapto-visual applications, such as the HAML player [6]. However, there is room for growth, and there are many more ways we can continue leveraging existing Blender functionality.

As future work, we plan to extend HAMLAT to include support for other haptic platforms and devices. Currently, only the PHANTOM series of devices is supported, since the interactive renderer depends on the OpenHaptics toolkit [5]. In order to support other devices, a cross-platform library such as CHAI 3D or Haptik may be used to perform the rendering; these libraries support force rendering for a large range of haptic hardware. Fortunately, due to the modularity of our implementation, only the interactive haptic rendering component needs to be altered for these changes. In addition to supporting multiple hardware platforms, a user-interface component which allows the selection and configuration of haptic devices will be important. Most likely, this will be added as part of the user preferences panel in Blender.

Adding support for haptic devices in editing tasks is also a planned feature. This will allow the modeler to modify the shape, location, and other properties of in-scene objects. For example, the sculpting mode in Blender allows a user to manipulate the geometry of a 3D object using a natural interface, similar to reshaping a piece of clay.
HAMLAT will build on this technology by allowing the modeler to manipulate the virtual clay using high-DOF haptic interfaces.
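The modularity argument made above (only the interactive haptic rendering component changes when swapping OpenHaptics for CHAI 3D or Haptik) can be sketched as a small interface. All class and method names here are hypothetical, not HAMLAT's actual API.

```python
# Sketch: if the application depends only on a narrow renderer
# interface, swapping haptic back-ends replaces a single component.
# Names are illustrative; neither backend here calls a real library.
from abc import ABC, abstractmethod

class HapticRenderer(ABC):
    @abstractmethod
    def render(self, scene):
        """Perform one haptic rendering pass over the scene."""

class OpenHapticsRenderer(HapticRenderer):
    def render(self, scene):
        return f"OpenHaptics: rendered {len(scene)} objects"

class Chai3DRenderer(HapticRenderer):
    def render(self, scene):
        return f"CHAI 3D: rendered {len(scene)} objects"

def simulation_step(renderer, scene):
    # The simulation loop sees only the interface, never the backend.
    return renderer.render(scene)
```

Under this kind of design, adding Haptik or another library means writing one more subclass, leaving the editing and scripting sub-systems untouched.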