Zee, R. van der (1999) Geometric Simplification Algorithms for Surfaces: Enhancing KLSDecimate. Master's Thesis / Essay, Computing Science.
Abstract
Over the years the size of datasets has grown so much that their visualization has become problematic. One of those problem areas is the rendering of 3D datasets: the complexity of the datasets has increased much faster than the performance of rendering hardware. Datasets of models used for medical or geographical applications contain up to millions of polygons, and are too complex for interactive visualization and manipulation. To manipulate and visualize datasets of this complexity we have to reduce their size.

The current hardware that can visualize such datasets interactively is able to process big datasets, but is also very expensive. More affordable hardware, for desktop use in offices and at home, cannot visualize these datasets interactively. The bandwidth of currently available networks is a limiting factor too: accessing data stored in huge databases, such as CAT scans made in hospitals, can only be done through local area networks and public telephone lines, whose bandwidths are limited. Although the speed of 3D graphics hardware will probably increase by a large factor in the coming years, the size of datasets will keep growing along with the advancement of technology.

Algorithms have been developed to reduce the size of datasets without losing too much detail. These algorithms are called decimation algorithms. In this field, people are searching for algorithms that maintain the quality of the original as much as possible while reducing the complexity of the dataset. In this report I discuss a decimation algorithm that reduces the complexity of a dataset representing a surface consisting of triangles: for example, the surface of a teapot, a dataset that stores the surface (the height at evenly spaced points) of a terrain, or the surface of a human skull generated by the marching cubes algorithm.
The algorithm discussed in this report is a decimation algorithm by Klein, Liebich and Straßer, as described in [KLS96]. The algorithm reduces the complexity of the dataset by removing points that satisfy a certain criterion. It gives better results than several other algorithms by using a special error measure, the 'Hausdorff distance'. I discuss the algorithm in chapter 3. The algorithm has already been implemented by A. Noord and R.M. Aten, as reported in [NA98]. Their implementation, however, can be improved at several points. I discuss some improvements of their implementation and describe how these change it; changes with respect to both speed and functionality are presented and implemented.
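To illustrate the error measure the algorithm relies on, the following is a minimal sketch of the discrete, two-sided Hausdorff distance between two finite point sets. The function names are hypothetical and chosen for this sketch; the thesis computes the distance between surfaces, not bare point sets, so this only conveys the basic idea of the measure.

```python
import math

def directed_hausdorff(points_a, points_b):
    """Largest distance from any point in points_a to its nearest point in points_b."""
    return max(
        min(math.dist(a, b) for b in points_b)
        for a in points_a
    )

def hausdorff_distance(points_a, points_b):
    """Symmetric (two-sided) Hausdorff distance between two point sets."""
    return max(
        directed_hausdorff(points_a, points_b),
        directed_hausdorff(points_b, points_a),
    )

# Example: the simplified set drops the point (2, 0), whose nearest
# remaining neighbour is (1, 0), so the Hausdorff distance is 1.0.
original = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
simplified = [(0.0, 0.0), (1.0, 0.0)]
print(hausdorff_distance(original, simplified))  # 1.0
```

The measure is symmetric by construction: a simplification that stays close to the original everywhere, and vice versa, yields a small value, which is why it bounds the visual error of a decimated surface.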
| Item Type: | Thesis (Master's Thesis / Essay) |
| --- | --- |
| Degree programme: | Computing Science |
| Thesis type: | Master's Thesis / Essay |
| Language: | English |
| Date Deposited: | 15 Feb 2018 07:29 |
| Last Modified: | 15 Feb 2018 07:29 |
| URI: | https://fse.studenttheses.ub.rug.nl/id/eprint/8836 |