Basic 'workbench' for processing point cloud data

Please post all open source software related items here, e.g. MeshLab
Post Reply
sudo_ki
I have made 30-40 posts
Posts: 33
Joined: Fri Mar 23, 2018 4:23 pm
Full Name: AJ Swanepoel
Company Details: -Unemployed-
Company Position Title: Survey nomad
Country: UK
Linkedin Profile: No

Basic 'workbench' for processing point cloud data

Post by sudo_ki » Fri Oct 19, 2018 7:49 pm

jedfrechette wrote:
Fri Oct 12, 2018 5:32 pm

I think one of the weaknesses of the software ecosystem in our industry is that developers have focused too much on trying to build solutions to specific problems rather than creating tools that make it easier for users to build their own solutions... The big vendors would like to think that everyone is doing the same types of scanning and producing the same deliverables. The astonishing variety of posts you see on this forum, however, clearly shows that simply is not true.
...

Build a GUI for PDAL. A minimum viable product would need to include a node based workspace for building filter pipelines and a 3D viewport to visualize the results...Tools that will work with any point cloud data I end up with regardless of source; airborne or mobile lidar, tripod scanner, hand-held scanner, photogrammetry, sonar...
It is my intention to start working towards what Jed describes; this has been on my mind for quite some time now, and having someone else voice the same opinion has been motivating.

Things I think are important to discuss: currently available libraries (e.g. PDAL, PCL) and toolkits for creating a GUI (e.g. GTK+).

Are there any others out there who feel the same?

If so, I'd love to hear from you.

Scott
V.I.P Member
Posts: 758
Joined: Tue Mar 29, 2011 7:39 pm
Full Name: Scott Page
Company Details: Scott Page Design- Architectural service
Company Position Title: Owner
Country: USA
Linkedin Profile: No

Re: Basic 'workbench' for processing point cloud data

Post by Scott » Fri Oct 19, 2018 10:20 pm

PyQtGraph
Scientific Graphics and GUI Library for Python
PyQtGraph is a pure-python graphics and GUI library built on PyQt4 / PySide and numpy. It is intended for use in mathematics / scientific / engineering applications. Despite being written entirely in python, the library is very fast due to its heavy leverage of numpy for number crunching and Qt's GraphicsView framework for fast display. PyQtGraph is distributed under the MIT open-source license.
http://www.pyqtgraph.org

https://pdal.io/about.html#what-is-pdal
What is PDAL?
PDAL is Point Data Abstraction Library. It is a C/C++ open source library and applications for translating and processing point cloud data. It is not limited to LiDAR data, although the focus and impetus for many of the tools in the library have their origins in LiDAR.
PCL
PCL is a complementary, rather than substitute, open source software processing suite for point cloud data. The developer community of the PCL library is focused on algorithm development, robotic and computer vision, and real-time laser scanner processing. PDAL links and uses PCL, and PDAL provides a convenient pipeline mechanism to orchestrate PCL operations.
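For readers who haven't seen one, a PDAL pipeline is just a JSON document listing stages in order. A minimal sketch, built with Python's standard `json` module (the stage names `readers.las`-style inference, `filters.decimation`, and the file-extension shorthand are standard PDAL conventions; the file names themselves are placeholders):

```python
import json

# A minimal PDAL pipeline: read a LAS file, keep every 10th point,
# and write the result. PDAL infers reader and writer stages from
# file extensions, so bare filenames work as reader/writer entries.
pipeline = {
    "pipeline": [
        "input.las",
        {"type": "filters.decimation", "step": 10},
        "output.las",
    ]
}

pipeline_json = json.dumps(pipeline, indent=2)
print(pipeline_json)
```

Saved to a file, this is exactly what `pdal pipeline` consumes on the command line.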

neeravbm
V.I.P Member
Posts: 129
Joined: Thu Mar 16, 2017 3:29 pm
Full Name: Neerav Mehta
Company Details: Indoor Intelligence
Company Position Title: CTO
Country: USA
Linkedin Profile: No

Re: Basic 'workbench' for processing point cloud data

Post by neeravbm » Fri Oct 19, 2018 11:26 pm

First you have to think whether you want the application to be cloud-based (accessible via browser) or desktop-based.

If you decide that you want it to be desktop-based, then here's what we use for creating algorithms for Scan To BIM (https://scantobim.xyz) and Rep3D (http://rep3d.com) internally:

1) Back-end code is all C++. The advantage of C++ is that the compiled code is very fast and it's relatively easy to offload processing to the GPU using CUDA or HIP (https://github.com/ROCm-Developer-Tools/HIP). If you are starting from scratch on GPU, I recommend that you look into HIP since it's portable across AMD and Nvidia GPUs. We also use OpenMP extensively in C++.
2) We are using our internally customized version of PCL to make it more efficient.
3) For UI for internal development, we use Qt. For displaying point clouds, we use VTK, which is probably the only real choice you have for point cloud visualization. VTK and Qt are compatible with each other. I am not sure whether VTK and GTK+ play nice with each other or not.
4) We also use a few machine learning and optimization libraries but you probably don't require them.
Neerav Mehta
CTO, Indoor Intelligence
Creators of http://scantobim.xyz and http://rep3d.com

sudo_ki
I have made 30-40 posts
Posts: 33
Joined: Fri Mar 23, 2018 4:23 pm
Full Name: AJ Swanepoel
Company Details: -Unemployed-
Company Position Title: Survey nomad
Country: UK
Linkedin Profile: No

Re: Basic 'workbench' for processing point cloud data

Post by sudo_ki » Sat Oct 20, 2018 2:08 am

neeravbm wrote:
Fri Oct 19, 2018 11:26 pm
For UI for internal development, we use Qt. For displaying point clouds, we use VTK, which is probably the only real choice you have for point cloud visualization.
Do you know how well Qt works with Vulkan compared to GTK+?

I wonder if anyone reading this has had experience with the Vulkan API; I'm pretty confident I will want my workbench to use it.

sudo_ki
I have made 30-40 posts
Posts: 33
Joined: Fri Mar 23, 2018 4:23 pm
Full Name: AJ Swanepoel
Company Details: -Unemployed-
Company Position Title: Survey nomad
Country: UK
Linkedin Profile: No

Re: Basic 'workbench' for processing point cloud data

Post by sudo_ki » Sat Oct 20, 2018 10:39 am

I recently came across Makie.jl, which seems to have superseded GLVisualize.jl (https://github.com/JuliaPlots/Makie.jl). Looks interesting...
PyQtGraph
This looks great, I think I need to do some further reading. http://vispy.org/ looks especially interesting. I wonder how well either of these will handle 500 million+ points.
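A rough sanity check on what 500 million points means in memory (assuming 32-bit float coordinates and one byte each for RGB plus intensity; real clouds often carry more attributes, so treat this as a floor):

```python
# Back-of-the-envelope memory estimate for a 500-million-point cloud.
n_points = 500_000_000

xyz_bytes = n_points * 3 * 4   # x, y, z as 32-bit floats
rgbi_bytes = n_points * 4      # RGB + intensity at 1 byte each (assumed)

total_gb = (xyz_bytes + rgbi_bytes) / 1e9
print(f"~{total_gb:.0f} GB")   # ~8 GB before any spatial index or overhead
```

Anything at that scale effectively forces out-of-core handling or aggressive level-of-detail in the viewer.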
Last edited by sudo_ki on Mon Oct 22, 2018 8:19 pm, edited 1 time in total.

sudo_ki
I have made 30-40 posts
Posts: 33
Joined: Fri Mar 23, 2018 4:23 pm
Full Name: AJ Swanepoel
Company Details: -Unemployed-
Company Position Title: Survey nomad
Country: UK
Linkedin Profile: No

Re: Basic 'workbench' for processing point cloud data

Post by sudo_ki » Mon Oct 22, 2018 8:18 pm

neeravbm wrote:
Fri Oct 19, 2018 11:26 pm
First you have to think whether you want the application to be cloud-based (accessible via browser) or desktop-based.
I'm pretty sure browser-based is best. I'm going to start a Jupyter notebook, which I will no doubt share here eventually.

Scott
V.I.P Member
Posts: 758
Joined: Tue Mar 29, 2011 7:39 pm
Full Name: Scott Page
Company Details: Scott Page Design- Architectural service
Company Position Title: Owner
Country: USA
Linkedin Profile: No

Re: Basic 'workbench' for processing point cloud data

Post by Scott » Mon Oct 22, 2018 8:43 pm

sudo_ki wrote:
Mon Oct 22, 2018 8:18 pm
neeravbm wrote:
Fri Oct 19, 2018 11:26 pm
First you have to think whether you want the application to be cloud-based (accessible via browser) or desktop-based.
I'm pretty sure browser-based is best. I'm going to start a Jupyter notebook, which I will no doubt share here eventually.
What is Jupyter Notebook?
https://www.youtube.com/watch?v=q_BzsPxwLOE

smacl
V.I.P Member
Posts: 174
Joined: Tue Jan 25, 2011 5:12 pm
Full Name: Shane MacLaughlin
Company Details: Atlas Computers Ltd
Company Position Title: Managing Director
Country: Ireland
Linkedin Profile: Yes
Location: Ireland

Re: Basic 'workbench' for processing point cloud data

Post by smacl » Tue Oct 23, 2018 8:24 am

sudo_ki wrote:
Sat Oct 20, 2018 10:39 am
I wonder how well either of these will handle 500 Million + nodes.
Really, one of the first things you have to check with these SDKs is whether their performance will scale to meet your requirements. If you're weighing server versus desktop, you also need to consider upload time to the server and the cost of server-side storage and bandwidth. I think desktop is far more effective than server for most processing activity, while server wins out when sharing the finished product. The highest bandwidth currently still seems to be achieved by posting SSD drives.
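That last point is easy to quantify. A rough comparison, assuming a 2 TB drive couriered overnight versus a 100 Mbit/s upload link (both figures are illustrative assumptions, not measurements):

```python
# Effective bandwidth of shipping a drive versus uploading the data.
drive_tb = 2          # assumed drive capacity
courier_hours = 24    # assumed overnight delivery

drive_bits = drive_tb * 1e12 * 8
courier_mbps = drive_bits / (courier_hours * 3600) / 1e6
print(f"courier: ~{courier_mbps:.0f} Mbit/s effective")        # ~185 Mbit/s

upload_mbps = 100     # assumed uplink
upload_hours = drive_bits / (upload_mbps * 1e6) / 3600
print(f"upload:  ~{upload_hours:.0f} hours for the same data")  # ~44 hours
```

The courier wins whenever the data set is large relative to the uplink, which is routinely the case for raw scan data.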

smacl
V.I.P Member
Posts: 174
Joined: Tue Jan 25, 2011 5:12 pm
Full Name: Shane MacLaughlin
Company Details: Atlas Computers Ltd
Company Position Title: Managing Director
Country: Ireland
Linkedin Profile: Yes
Location: Ireland

Re: Basic 'workbench' for processing point cloud data

Post by smacl » Tue Oct 23, 2018 8:30 am

neeravbm wrote:
Fri Oct 19, 2018 11:26 pm
First you have to think whether you want the application to be cloud-based (accessible via browser) or desktop-based.

If you decide that you want it to be desktop-based, then here's what we use for creating algorithms for Scan To BIM (https://scantobim.xyz) and Rep3D (http://rep3d.com) internally:

1) Back-end code is all C++. The advantage of C++ is that the compiled code is very fast and it's relatively easy to offload processing to the GPU using CUDA or HIP (https://github.com/ROCm-Developer-Tools/HIP). If you are starting from scratch on GPU, I recommend that you look into HIP since it's portable across AMD and Nvidia GPUs. We also use OpenMP extensively in C++.
2) We are using our internally customized version of PCL to make it more efficient.
3) For UI for internal development, we use Qt. For displaying point clouds, we use VTK, which is probably the only real choice you have for point cloud visualization. VTK and Qt are compatible with each other. I am not sure whether VTK and GTK+ play nice with each other or not.
4) We also use a few machine learning and optimization libraries but you probably don't require them.
Great advice, and very similar here, though we use HLSL for GPU work as I'd not come across HIP, which leaves us Microsoft-centric. I must have a look at it, as there are a few CUDA libraries that would come in handy but that I've avoided because I didn't want to be Nvidia-specific. How do you find PCL scales with larger point clouds? It's been some years since I looked at it.

jedfrechette
V.I.P Member
Posts: 840
Joined: Mon Jan 04, 2010 7:51 pm
Full Name: Jed Frechette
Company Details: Lidar Guys
Company Position Title: CEO and Lidar Supervisor
Country: USA
Linkedin Profile: Yes
Location: Albuquerque, NM

Re: Basic 'workbench' for processing point cloud data

Post by jedfrechette » Tue Oct 23, 2018 5:09 pm

I specifically mentioned PDAL in my original post because I think its architecture is uniquely well suited to forming a basis for what I have in mind.

For pipelines that can be executed in streaming mode, there are effectively no memory limitations on the size of data that can be processed, and for operations that do need to load the entire data set into memory, it contains plenty of tiling and decimation options so users can make that manageable on whatever hardware they have available. I think the ability to mix proprietary and open source stages in the same pipeline is also very important, as it is not reasonable to expect that all tools and algorithms will be released as open source. Similarly, by abstracting each "Filter" as a separate stand-alone stage in the pipeline, filter authors have a great deal of freedom in how individual filters are implemented: some might be written in Python, some might run on the GPU, and some might require exotic libraries that are only available on specific platforms.

At the same time PDAL's biggest weakness is acknowledged right at the beginning of its documentation:
PDAL doesn’t provide a friendly GUI interface, it expects that you have the confidence to dig into the options of Filters, Readers, and Writers
I think that statement undersells the weakness. Effectively, users need to define a data-flow graph by authoring JSON files, which is not ideal even for expert users. Although I don't recall seeing it discussed on the mailing list, I'd be surprised if the core contributors haven't thought about what a GUI might look like. I would suggest checking in with the PDAL developers to see if they have any thoughts on how a GUI should be implemented.
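To make the idea concrete, here is a hypothetical sketch of what such a node editor would do under the hood: serialize an ordered chain of node widgets into the JSON that PDAL actually consumes. The node structure is invented for illustration; only the output format follows PDAL's pipeline convention, and `readers.las`, `filters.outlier`, and `writers.las` are real PDAL stage types.

```python
import json

# Hypothetical node objects a GUI node editor might hold; each node
# carries a PDAL stage type and its options.
nodes = [
    {"stage": "readers.las", "options": {"filename": "scan.las"}},
    {"stage": "filters.outlier", "options": {"method": "statistical"}},
    {"stage": "writers.las", "options": {"filename": "clean.las"}},
]

def nodes_to_pipeline(nodes):
    """Serialize a linear chain of editor nodes into PDAL pipeline JSON."""
    stages = [{"type": n["stage"], **n["options"]} for n in nodes]
    return json.dumps({"pipeline": stages}, indent=2)

print(nodes_to_pipeline(nodes))
```

The hard GUI problems (validating options per stage, handling branching graphs) sit on top of this, but the core translation really is this small.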

smacl wrote:
Tue Oct 23, 2018 8:24 am
I think desktop is far more effective than server for most processing activity
I agree. There could certainly be value in preparing a long-running process on a local workstation and then sending it off to a local server farm to execute, but I think processes should be local first. The farther from my workstation execution occurs, the greater the costs in terms of security, transfer times, and storage. So far I haven't seen many compelling cloud processing services that overcome those increased costs, so I don't have much interest in remote applications running inside my web browser.


We use Qt for our internal GUI development too. I haven't used it myself, so I don't know if it is the best option, but this is the Qt library I have starred for building node graphs:

https://github.com/paceholder/nodeeditor


Even though I listed a 3D viewport as part of a minimum viable product in the original post, I don't think that it is actually needed to build a useful GUI for PDAL. A GUI node editor for authoring PDAL pipeline files would be useful even if those pipelines needed to be executed and the results viewed externally. Nonetheless, there are a few options to consider for a 3D viewport.

VTK seems like a reasonable choice, and I used it for some small point cloud work several years ago. ParaView is built on VTK and is designed for massive data sets, so it should be able to scale. Velodyne also has a basic application for viewing their lidar data that is built on top of ParaView.

CloudCompare is also an obvious place to look to see how they handle point cloud rendering. I haven't dug into it, but I believe they mostly use built-in Qt libraries with their own octree acceleration structure to store the point data.

Thinkbox used the Ogre3D game engine for the 3D viewport when they built Sequoia, so that could be another option to look at.

Regardless of the actual library used to do the rendering, I think the acceleration structure underneath it, which feeds the right points to the viewport at the right time, is the more important piece.
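A toy illustration of that point: even a flat voxel-grid hash (a crude stand-in for the octrees real viewers use) lets you hand the renderer one representative point per cell instead of the full cloud. Everything here is invented for illustration.

```python
# Toy level-of-detail filter: bucket points into a voxel grid and keep
# one representative per occupied cell. Real viewers use octrees so the
# cell size can vary with camera distance; this fixed grid is the
# simplest version of the same idea.
def voxel_decimate(points, cell=1.0):
    seen = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell), int(z // cell))
        seen.setdefault(key, (x, y, z))   # keep the first point per cell
    return list(seen.values())

pts = [(0.1, 0.2, 0.0), (0.3, 0.4, 0.0),   # same cell -> one survivor
       (1.5, 0.2, 0.0),                    # different cell
       (0.2, 1.7, 0.0)]                    # different cell
print(len(voxel_decimate(pts)))            # 3 representatives from 4 points
```

An octree is essentially this with a hierarchy of cell sizes, so the viewport can ask for coarse cells far away and fine cells up close.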
Jed

Post Reply

Return to “Open Source Software”