From images to points to images: a step-by-step guide of digital aerial photogrammetry

by Brandan Gatz, Miguel Siguenza, and Yuhao Lu

last update: September 2024

Top to bottom: RGB orthoimage, near infrared, point cloud, vegetation (NDVI), and digital surface height model (DSM)

  • Click here and here to see models generated from this project. We recommend using a desktop browser (Chrome or Microsoft Edge) and a mouse (not a trackpad).

  • Click here to access the manual (PDF, ~60MB).

  • Click here to get in touch if you have questions.

FAQ

  • This document was created for the DJI Mavic 3 Multispectral. Students at the University of Manitoba can borrow this drone from the Fablab. We also recommend using an RTK module (also available at the Fablab) to improve satellite positioning accuracy.

  • We use Agisoft Metashape Pro. At the moment we have one license for students to use free of charge, and in the near future we plan to add another five floating licenses. Agisoft Metashape runs on both PC and Mac.

    We chose Metashape for its relatively low cost (a one-time payment with a generous education discount) and its large, active user community around the world.

  • Can I use a different drone? Certainly. While this document was created using the DJI Mavic 3 Multispectral, the principles and processing techniques also apply to other, more affordable units such as the Mavic 3 Mini. The scan and image products, however, might not be as accurate or complete (e.g., the Mavic 3 Mini does not have a multispectral sensor).

  • The largest site we scanned was just under 3 km² and took about 5 hours. While we have enough batteries to cover larger sites, the trade-off you need to be aware of is lighting: larger scans mean longer mission times with more variation in lighting conditions.

  • We recommend a PC with 16 GB+ RAM, a decent graphics card, and at least 50 GB of storage. Our PCs in the CADlab have sufficient computing power to handle the processing.

  • Is this the same as laser scanning (LiDAR)? No. Digital aerial photogrammetry is a passive remote sensing technique: it reconstructs a 3D scene (point cloud and mesh) from images taken by an optical camera from multiple angles (multi-view stereo matching). A laser scanner is an active remote sensing instrument: it emits laser beams and records positions from the reflected laser pulses.

  • Most optical camera sensors are only sensitive to the visible part of the spectrum. A multispectral sensor can see beyond RGB. The added bands let us see more variation and subtle changes in surface materials, particularly vegetation.
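    As an illustration of what the extra band buys you, the NDVI layer shown at the top of this page is computed per pixel from the near-infrared and red bands. Below is a minimal sketch with NumPy; the reflectance values are made up for the example.

    ```python
    import numpy as np

    # NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to +1.
    # Healthy vegetation reflects strongly in the near infrared and absorbs
    # red light, so it scores high; bare soil and water sit near or below zero.
    def ndvi(nir, red):
        nir = np.asarray(nir, dtype=np.float64)
        red = np.asarray(red, dtype=np.float64)
        denom = nir + red
        # Guard against division by zero where both bands are dark.
        safe = np.where(denom == 0, 1.0, denom)
        return np.where(denom == 0, 0.0, (nir - red) / safe)

    # Toy 2x2 reflectance values: left column vegetated, right column bare.
    nir_band = [[0.50, 0.30], [0.60, 0.10]]
    red_band = [[0.10, 0.30], [0.10, 0.30]]
    print(ndvi(nir_band, red_band))
    ```

    In practice you would apply the same formula to whole orthoimage bands at once; NumPy broadcasts it over every pixel.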

  • The 3D models can be exported to common file formats for 3D printing and post-modelling in programs such as Rhino.

    The point cloud can be saved as .las, which is readable by CloudCompare and common GIS programs.

    The 2D images are GeoTIFF files, orthorectified and georeferenced. You can open them in QGIS, ArcGIS, and other common GIS software.
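    For readers who want to work with these products in code rather than a GIS, the sketch below round-trips a tiny GeoTIFF in memory using the open-source rasterio library (our choice for illustration; it is a common Python binding to GDAL, the same engine QGIS uses). The CRS and pixel size are hypothetical: EPSG:32614 is UTM zone 14N, which covers Winnipeg.

    ```python
    import numpy as np
    import rasterio
    from rasterio.transform import from_origin

    # A GeoTIFF stores pixel values plus georeferencing metadata: a coordinate
    # reference system (CRS) and an affine transform mapping pixel indices to
    # map coordinates. Hypothetical values: 5 cm pixels, UTM zone 14N.
    transform = from_origin(633000.0, 5528000.0, 0.05, 0.05)
    data = np.random.rand(1, 100, 100).astype("float32")  # one band, 100x100 px

    with rasterio.MemoryFile() as memfile:
        with memfile.open(
            driver="GTiff", height=100, width=100, count=1,
            dtype="float32", crs="EPSG:32614", transform=transform,
        ) as dst:
            dst.write(data)
        with memfile.open() as src:
            crs_read = src.crs
            origin = src.transform * (0, 0)  # map coords of top-left corner

    print(crs_read, origin)
    ```

    That metadata is exactly what lets QGIS or ArcGIS place an orthoimage correctly over a basemap.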

  • Please reach out to us if you are in Winnipeg. We will try our best to support our students, NGOs, and local communities.

  • We used Potree to convert our point cloud to web-friendly content; it loads millions of points in seconds. Potree is open source and free for everyone to use.

    Compute Canada hosts our web server and its content.

  • We need to further validate the positional accuracy of the scans to ensure the quality of derived products such as terrain models and tree heights.

    We are also interested in using the multispectral images to identify common tree species or genera from structural (point cloud) and spectral (near-infrared) data. If you are interested in collaborating, please get in touch!

  • The DAP method can be sensitive to weather (wind) and lighting conditions: shadows and blurred images degrade the quality of the scan. In addition, given the nature of the scanning method, only surface materials are visible to the sensor(s); terrain under dense vegetation or undergrowth, for example, can be challenging to capture.

Acknowledgement: We thank the URA, Fablab, the Faculty of Architecture at the University of Manitoba, the SIEF grant (and its student committee members), and Compute Canada for supporting this project. We also appreciate all the feedback and enthusiasm from everyone who stopped by and listened to our presentations.
