Assignment 04
Processing point clouds
Deadline is 2023-01-23 17:00
Late submission? 10% of the marks will be deducted for each day that you are late.
For this assignment you are allowed to work in a group of 3 (and thus submit only one solution for the three of you). You are free to form a group yourself; if you’re looking for a partner, let me know (Hugo), or let others know on Discord. Groups of two are allowed (but not recommended); working alone is not an option.
- Overview
- Step 1: Pick a team to get the AHN3 tile assigned to you
- Step 2: Download and prepare the dataset
- Step 3: Filter the ground
- Step 4: Create the DTM of your region, using two methods
- Step 5: Compare your two created DTMs
- Step 6: If you used thinning, would results be different?
- Step 7: To help with visualisation, extract isolines
- If you use Python
- If you use C++
- Good to know
- Marking
- What to submit and how to submit it
Overview
In this assignment you need to:
- automatically classify a 500m × 500m region of the AHN3 point cloud into ground and non-ground using ground filtering
- make a raster DTM using 2 different methods
- compare your DTM rasters with the official AHN3 DTM raster
- experiment with different thinning methods and parameters
- to help with visualisation, extract isolines
You can do this assignment using either Python or C++, or a mix of both (one step with Python, another with C++). We recommend C++ because this will prepare you for the courses in the next two quarters (C++ is the language used in GEO1004 and GEO1016).
Do you need to write the code yourself for each step? Here’s the answer:
- cropping the dataset: you can use a LAZ library (eg laspy) to read the file, but the cropping must be done with your own code. Writing is done with laspy too.
- CSF: your own code 100%
- create the DTM: you can use startinpy (or CGAL) to interpolate, the rest is your code
- thinning: your own code
- extracting the isolines: your own code
- visualising the isolines: anything you want, QGIS or matplotlib are allowed
- comparison with official AHN3 DTMs: you can use any library/software you want
In doubt? Just ask on Discord.
Step 1: Pick a team to get the AHN3 tile assigned to you
Once you have your team, fill your 3 student numbers in the Google Sheet and I’ll assign you one AHN3 tile (each team gets a different tile).
Step 2: Download and prepare the dataset
- Download the tile you were assigned in LAZ format from PDOK
- Select the 500m × 500m area in the tile that is most interesting (a mix of buildings, terrain, forest, and some water), and provide its bounding box in the report
- Extract the points from your region to the format you want for further processing
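The crop itself is just a bounding-box test on the x/y coordinates. A minimal sketch with NumPy (the coordinates and bounding box below are made up; with laspy you would first load the real x/y/z values from your LAZ tile):

```python
import numpy as np

def crop_points(pts, bbox):
    """Keep only the points inside the axis-aligned bounding box.

    pts:  (n, 3) array of x, y, z coordinates
    bbox: (xmin, ymin, xmax, ymax)
    """
    xmin, ymin, xmax, ymax = bbox
    mask = ((pts[:, 0] >= xmin) & (pts[:, 0] < xmax) &
            (pts[:, 1] >= ymin) & (pts[:, 1] < ymax))
    return pts[mask]

# hypothetical points and 500m x 500m region inside an AHN3 tile
pts = np.array([[1000.0, 2000.0, 5.0],
                [1250.0, 2250.0, 7.0],
                [9999.0, 9999.0, 1.0]])
cropped = crop_points(pts, (1000.0, 2000.0, 1500.0, 2500.0))
print(len(cropped))  # 2
```
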
Step 3: Filter the ground
Implement the cloth simulation filter (CSF) algorithm (book section 11.4.2).
Notice that you are not allowed to use the existing classification of the points; only the geometry should be used. This is important!
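To give an idea of the main mechanic, here is a deliberately simplified sketch: the point cloud is inverted, a cloth of grid particles falls onto it under gravity, neighbour constraints keep the cloth stiff, and points close to the settled cloth are labelled ground. This is not the full CSF of section 11.4.2 (no per-particle velocities or movable/unmovable bookkeeping, `np.roll` wraps around the borders, and all parameter values are arbitrary), so treat it only as a starting point:

```python
import numpy as np

def csf_ground_mask(pts, cell=2.0, rigidness=3, iters=100, eps=0.5, tol=0.5):
    """Very simplified cloth-simulation-filter sketch.

    pts: (n, 3) array of x, y, z; returns a boolean mask marking ground points.
    """
    # invert the point cloud so the cloth can fall onto it from above
    z_inv = -pts[:, 2]
    xmin, ymin = pts[:, 0].min(), pts[:, 1].min()
    nx = int(np.ceil((pts[:, 0].max() - xmin) / cell)) + 1
    ny = int(np.ceil((pts[:, 1].max() - ymin) / cell)) + 1

    # highest inverted point per cloth cell acts as the collision surface
    ix = ((pts[:, 0] - xmin) / cell).astype(int)
    iy = ((pts[:, 1] - ymin) / cell).astype(int)
    surf = np.full((ny, nx), -np.inf)
    np.maximum.at(surf, (iy, ix), z_inv)
    surf[~np.isfinite(surf)] = z_inv.min()  # empty cells: no constraint

    # cloth starts above the highest inverted point and falls by gravity
    cloth = np.full((ny, nx), z_inv.max() + 10.0)
    pinned = np.zeros_like(cloth, dtype=bool)
    for _ in range(iters):
        # gravity step, but never below the collision surface
        fall = np.where(pinned, cloth, cloth - eps)
        hit = fall <= surf
        cloth = np.where(hit, surf, fall)
        pinned |= hit
        # internal constraints: pull free particles towards neighbour average
        for _ in range(rigidness):
            avg = (np.roll(cloth, 1, 0) + np.roll(cloth, -1, 0) +
                   np.roll(cloth, 1, 1) + np.roll(cloth, -1, 1)) / 4.0
            cloth = np.where(pinned, cloth, np.maximum(avg, surf))

    # a point is ground if it ends up close to the settled cloth
    return np.abs(z_inv - cloth[iy, ix]) < tol

# synthetic check: flat ground with one high (building) point at the end
xs, ys = np.meshgrid(np.arange(0.0, 11.0), np.arange(0.0, 11.0))
pts = np.vstack([np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)]),
                 [[5.0, 5.0, 10.0]]])
mask = csf_ground_mask(pts)
print(mask[:-1].all(), mask[-1])  # True False
```
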
Step 4: Create the DTM of your region, using two methods
- using the cloth above to create the DTM
- by classifying the points as ground + using Laplace interpolation
The gridded DTMs should have a resolution of 0.5m.
You are allowed to use the startinpy Laplace function.
If you want to use C++ for this step, you will have to use CGAL; its natural neighbour interpolation (which is very similar to Laplace) can be used instead, and we give an example in the provided code on GitLab. (You can also just use Python for this step.)
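Whatever interpolation method you use, you will need the cell centres of the 0.5m output grid as sample locations. A small sketch (the bounding box is a made-up example):

```python
import numpy as np

def dtm_grid_centres(bbox, cell=0.5):
    """Cell-centre coordinates of a gridded DTM covering bbox at `cell` resolution."""
    xmin, ymin, xmax, ymax = bbox
    xs = np.arange(xmin + cell / 2, xmax, cell)
    ys = np.arange(ymin + cell / 2, ymax, cell)
    return np.meshgrid(xs, ys)

# hypothetical 500m x 500m region -> a 1000 x 1000 raster at 0.5m
xx, yy = dtm_grid_centres((0.0, 0.0, 500.0, 500.0))
print(xx.shape)  # (1000, 1000)
```

These (x, y) centres are then the locations at which you would evaluate your interpolant (eg startinpy's Laplace function); check the startinpy documentation for the exact call.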
Step 5: Compare your two created DTMs
- with each other,
- and with the official one from PDOK (0.5m DTM AHN3).
How to compare them is left to you (you can use existing libraries and software), but highlight their differences and try to explain in the report why they are different.
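One straightforward way to compare two aligned rasters is a per-cell difference plus a few summary statistics; nodata cells must be excluded. A sketch (the nodata value and the toy rasters are assumptions):

```python
import numpy as np

def compare_rasters(a, b, nodata=-9999.0):
    """Per-cell difference between two aligned DTM rasters, plus summary stats."""
    valid = (a != nodata) & (b != nodata)
    diff = np.where(valid, a - b, np.nan)  # NaN where either raster is nodata
    stats = {
        "mean": float(np.nanmean(diff)),
        "rmse": float(np.sqrt(np.nanmean(diff ** 2))),
        "max_abs": float(np.nanmax(np.abs(diff))),
    }
    return diff, stats

a = np.array([[1.0, 2.0], [3.0, -9999.0]])
b = np.array([[1.0, 1.0], [2.5, 4.0]])
diff, stats = compare_rasters(a, b)
print(stats["max_abs"])  # 1.0
```

The difference raster itself is worth mapping (eg in QGIS or matplotlib): the spatial pattern of the errors usually explains more than the global statistics.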
Step 6: If you used thinning, would results be different?
Thin the points in your 500m × 500m region (keeping both about 50% and about 10% of the total points) with two different methods: do these change the resulting DTM much? Show some differences in the report.
This means that steps 3-4-5 need to be performed again.
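Two simple thinning strategies that fit the brief are random thinning and keeping every n-th point. A sketch (the function names are ours, and the seeded RNG is only there to make the result reproducible):

```python
import numpy as np

def thin_random(pts, keep, rng=None):
    """Keep roughly a `keep` fraction of the points, chosen at random."""
    rng = rng if rng is not None else np.random.default_rng(0)
    return pts[rng.random(len(pts)) < keep]

def thin_nth(pts, keep):
    """Keep every n-th point, where n = round(1 / keep)."""
    n = max(1, round(1 / keep))
    return pts[::n]

pts = np.zeros((1000, 3))
print(len(thin_nth(pts, 0.10)))  # 100
```
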
Step 7: To help with visualisation, extract isolines
Pick one of the DTMs you created above, and write the code to extract isolines (at every 2m) and show the results in the report.
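A common approach to extract isolines from a raster is marching squares: in every 2×2 block of cells, find where the contour level crosses the cell edges by linear interpolation, and connect the crossings. A simplified sketch (it resolves ambiguous saddle cells arbitrarily and leaves the segments unchained):

```python
import numpy as np

def extract_isoline(z, level, cell=0.5):
    """Extract line segments where raster `z` crosses `level` (marching squares).

    Returns a list of ((x1, y1), (x2, y2)) segments in raster coordinates.
    A real implementation should also chain the segments into polylines.
    """
    segs = []
    ny, nx = z.shape
    for i in range(ny - 1):
        for j in range(nx - 1):
            # corner values: bottom-left, bottom-right, top-right, top-left
            corners = [(j, i, z[i, j]), (j + 1, i, z[i, j + 1]),
                       (j + 1, i + 1, z[i + 1, j + 1]), (j, i + 1, z[i + 1, j])]
            crossings = []
            for k in range(4):
                (x1, y1, v1), (x2, y2, v2) = corners[k], corners[(k + 1) % 4]
                if (v1 < level) != (v2 < level):  # this edge crosses the level
                    t = (level - v1) / (v2 - v1)
                    crossings.append(((x1 + t * (x2 - x1)) * cell,
                                      (y1 + t * (y2 - y1)) * cell))
            # connect crossing points pairwise (2 or 4 crossings per cell)
            for a, b in zip(crossings[::2], crossings[1::2]):
                segs.append((a, b))
    return segs

# a simple ramp: the 0.5 isoline is a vertical line at x = 0.5
z = np.array([[0.0, 1.0, 2.0],
              [0.0, 1.0, 2.0]])
segs = extract_isoline(z, 0.5, cell=1.0)
print(len(segs))  # 1
```

For isolines at every 2m you would call this once per level (eg `np.arange(zmin, zmax, 2.0)`).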
If you use Python
Then no starting code is provided; you are free to use the code we gave you for the previous assignments and the code you wrote to solve them.
To read LAZ files, use laspy.
If you use C++
Then we provide some code in the GitLab repository of the course (/hw/04/) to help you, eg to read point clouds and write gridded terrains (in ASC format).
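If you work (partly) in Python, the ESRI ASCII grid (ASC) format the C++ starter code writes is simple enough to emit by hand. A sketch (the header keywords follow the format; the -9999 nodata value is only a common convention):

```python
import numpy as np

def write_asc(path, raster, xllcorner, yllcorner, cellsize=0.5, nodata=-9999):
    """Write a raster to the ESRI ASCII grid (.asc) format.

    Row 0 of `raster` is assumed to be the northernmost row, as in the format.
    """
    nrows, ncols = raster.shape
    with open(path, "w") as f:
        f.write(f"NCOLS {ncols}\n")
        f.write(f"NROWS {nrows}\n")
        f.write(f"XLLCORNER {xllcorner}\n")
        f.write(f"YLLCORNER {yllcorner}\n")
        f.write(f"CELLSIZE {cellsize}\n")
        f.write(f"NODATA_VALUE {nodata}\n")
        for row in raster:
            f.write(" ".join(str(v) for v in row) + "\n")
```

The resulting file can be opened directly in QGIS or CloudCompare to inspect your DTM.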
Good to know
- To speed up development and testing you can use thinning (ie only load a random subset of the input point cloud) or crop the input datasets. However, your final results need to be created with the full point cloud!
- you can inspect the LAS/LAZ files with CloudCompare.
- you do not need to produce one piece of software, each part can be performed with different code, written in different languages.
Marking
The following will be evaluated, each part refers to both the code (we will try to run it) and to the report (quality, completeness, analysis).
| Criterion | Points |
|---|---|
| code working + README.txt | 2.0 |
| CSF implementation | 3.0 |
| preparing the dataset | 1.0 |
| creating DTMs | 2.0 |
| thinning | 0.5 |
| isolines | 1.5 |
What to submit and how to submit it
You have to submit:
- All the code you wrote; clean it, comment it, and make it usable by others (eg no fixed paths and no hard-coded parameters)
- a README.txt explaining how to run the code, what input it expects, and where it writes files. Try not to hardcode names of files or parameters.
- a report (~15-20 pages maximum) in PDF (no Word file please) where you elaborate on:
- for each step, how you performed it (focus on the theory; I can read the code)
- present the main issues you had and how you fixed them
- the differences between your results and the official DTM of AHN3
- a section describing who did what in the team
Those files need to be zipped, and the name of the file consists of the student IDs of the members separated by a “_”, for example 5015199_4018169_4123169.zip.
Upload the ZIP file to this surfdrive page (it doesn’t send a clear confirmation, but the names of the uploaded files are shown).
[last updated: 2022-12-23 15:02]