MeshLab

From Wikipedia, the free encyclopedia
Developer(s): ISTI - CNR
Stable release: 2016.12 / December 23, 2016[1]
Written in: C++, JavaScript
Operating system: Cross-platform
Type: Graphics software
License: GPL
Website: www.meshlab.net, www.meshlabjs.net

MeshLab is a 3D mesh processing software system oriented toward the management and processing of large unstructured meshes, providing a set of tools for editing, cleaning, healing, inspecting, rendering, and converting such meshes. MeshLab is free and open-source software, subject to the requirements of the GNU General Public License (GPL), version 2 or later, and is used both as a complete package and as a library powering other software. It is well known in the more technical fields of 3D development and data handling.

MeshLab is developed by the ISTI - CNR research center; it was initially created as a course assignment at the University of Pisa in late 2005. It is a general-purpose system aimed at processing the large, unstructured 3D models that typically arise in the 3D scanning pipeline.

The automatic mesh cleaning filters include removal of duplicate and unreferenced vertices, non-manifold edges and vertices, and null faces. Remeshing tools support high-quality simplification based on the quadric error measure, various kinds of subdivision surfaces, and two surface reconstruction algorithms from point clouds, based on the ball-pivoting technique and on the Poisson surface reconstruction approach. For the removal of noise, usually present in acquired surfaces, MeshLab supports various kinds of smoothing filters and tools for curvature analysis and visualisation.
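The kind of cleaning described above can be illustrated with a short sketch. This is not MeshLab's implementation (MeshLab is written in C++ on top of the VCG library); it is a minimal, self-contained example of what "removing duplicate vertices, null faces, and unreferenced vertices" means for an indexed triangle mesh:

```python
def clean_mesh(vertices, faces):
    """vertices: list of (x, y, z) tuples; faces: list of index triples."""
    # 1. Merge duplicate vertices: map each coordinate tuple to one index.
    first_index = {}
    remap = []
    for v in vertices:
        if v not in first_index:
            first_index[v] = len(first_index)
        remap.append(first_index[v])
    merged = [None] * len(first_index)
    for v, j in first_index.items():
        merged[j] = v
    faces = [tuple(remap[i] for i in f) for f in faces]
    # 2. Drop null (degenerate) faces that repeat a vertex index.
    faces = [f for f in faces if len(set(f)) == 3]
    # 3. Drop vertices not referenced by any remaining face.
    used = sorted({i for f in faces for i in f})
    new_index = {old: new for new, old in enumerate(used)}
    vertices = [merged[i] for i in used]
    faces = [tuple(new_index[i] for i in f) for f in faces]
    return vertices, faces
```

For example, a mesh with a duplicated vertex, a degenerate face, and an isolated vertex comes out with all three defects removed, while the valid triangles survive with their indices compacted.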

It includes a tool for the registration of multiple range maps based on the iterative closest point algorithm. MeshLab also includes an interactive direct paint-on-mesh system that allows users to interactively change the color of a mesh, to define selections, and to directly smooth out noise and small features.
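A single step of the iterative closest point (ICP) idea behind that registration tool can be sketched in two dimensions. The data and function names below are illustrative, not MeshLab's API; a real implementation iterates this step, works in 3D, and rejects bad correspondences:

```python
import math

def closest(p, points):
    # Nearest neighbour by squared Euclidean distance (brute force).
    return min(points, key=lambda q: (p[0] - q[0])**2 + (p[1] - q[1])**2)

def icp_step(source, target):
    """Return (theta, tx, ty): a rigid motion moving source toward target."""
    pairs = [(p, closest(p, target)) for p in source]
    # Centroids of the matched point sets.
    cx = sum(p[0] for p, _ in pairs) / len(pairs)
    cy = sum(p[1] for p, _ in pairs) / len(pairs)
    dx = sum(q[0] for _, q in pairs) / len(pairs)
    dy = sum(q[1] for _, q in pairs) / len(pairs)
    # Closed-form 2-D rotation from cross/dot sums of the centered pairs.
    s_cross = sum((p[0]-cx)*(q[1]-dy) - (p[1]-cy)*(q[0]-dx) for p, q in pairs)
    s_dot   = sum((p[0]-cx)*(q[0]-dx) + (p[1]-cy)*(q[1]-dy) for p, q in pairs)
    theta = math.atan2(s_cross, s_dot)
    # Translation that maps the rotated source centroid onto the target centroid.
    tx = dx - (cx * math.cos(theta) - cy * math.sin(theta))
    ty = dy - (cx * math.sin(theta) + cy * math.cos(theta))
    return theta, tx, ty
```

When the two point sets differ by a pure translation and the nearest-neighbour matches are correct, one step already recovers the exact offset.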

MeshLab is available for most platforms, including Linux, Mac OS X, Windows and, with reduced functionality, on Android and iOS and even as a pure client-side JavaScript application called MeshLabJS. The system supports input/output in the following formats: PLY, STL, OFF, OBJ, 3DS, VRML 2.0, U3D, X3D and COLLADA. MeshLab can also import point clouds reconstructed using Photosynth.
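To make the format-conversion role concrete, here is a hypothetical minimal converter between two of the listed formats, ASCII OFF in and Wavefront OBJ out. MeshLab itself handles far more (normals, colors, polygons, binary variants); this sketch assumes a plain triangle mesh:

```python
def off_to_obj(off_text):
    """Convert a minimal ASCII OFF mesh (triangles only) to OBJ text."""
    lines = [ln for ln in off_text.splitlines()
             if ln.strip() and not ln.lstrip().startswith('#')]
    assert lines[0].strip() == 'OFF'
    n_verts, n_faces, _ = (int(x) for x in lines[1].split())
    out = []
    for ln in lines[2:2 + n_verts]:
        x, y, z = ln.split()[:3]
        out.append(f'v {x} {y} {z}')
    for ln in lines[2 + n_verts:2 + n_verts + n_faces]:
        idx = [int(x) for x in ln.split()]
        # OFF face records start with a vertex count; OBJ indices are 1-based.
        out.append('f ' + ' '.join(str(i + 1) for i in idx[1:1 + idx[0]]))
    return '\n'.join(out) + '\n'
```

A one-triangle OFF file, for instance, converts to three `v` records followed by a single 1-based `f` record.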

MeshLab is used in various academic and research contexts, like microbiology,[2] cultural heritage,[3] surface reconstruction,[4] paleontology,[5] for rapid prototyping in orthopedic surgery,[6] in orthodontics,[7] and desktop manufacturing.[8]

References

  1. ^ "MeshLab 2016.12 release notes". Official GitHub repository.
  2. ^ Berejnov, V. V. (2009). "Rapid and Inexpensive Reconstruction of 3D Structures for Micro-Objects Using Common Optical Microscopy" (PDF). arXiv:0904.2024. Bibcode:2009arXiv0904.2024B.
  3. ^ Remondino, F.; Menna, F. (2008). "Image-based surface measurement for close-range heritage documentation" (PDF). The International Archives of the Photogrammetry. Retrieved 28 April 2017.
  4. ^ Xu, S.; Georghiades, A.; Rushmeier, H.; Dorsey, J. (2006). "Image guided geometry inference" (PDF). 3DPVT Symposium.
  5. ^ Abel, R. L.; et al. (Aug 2011). "Digital preservation and dissemination of ancient lithic technology with modern micro-CT" (PDF). Computers & Graphics. Elsevier. 35 (4): 878–884. doi:10.1016/j.cag.2011.03.001.
  6. ^ Frame, M.; Huntley, J. S. (2012). "Rapid Prototyping in Orthopaedic Surgery: A User's Guide". The Scientific World Journal. 2012: 838575. doi:10.1100/2012/838575. PMC 3361341. PMID 22666160.
  7. ^ Harjunmaa, E.; Kallonen, A.; Voutilainen, M.; et al. (15 March 2012). "On the difficulty of increasing dental complexity". Nature. 483 (7389): 324–327. Bibcode:2012Natur.483..324H. doi:10.1038/nature10876. PMID 22398444.
  8. ^ "Desktop Manufacturing". Make. Jan 2010. p. 73.
