System Benchmarking

Carbix
I have made 90-100 posts
Posts: 91
Joined: Sat Mar 16, 2019 1:36 am
Full Name: Daniel Loney
Company Details: Excelsior Measuring
Company Position Title: Owner
Country: Canada
Linkedin Profile: Yes
Location: Canada
Has thanked: 8 times
Been thanked: 9 times

System Benchmarking

Post by Carbix »

I think it's time we came up with our own benchmark: something we can use to test our systems and see if they are performing to the standards of others.

I would be happy to provide scans from an RTC360.

Something with 3-5 scans in it. I would think it should take at least 30 minutes of run time to import.

Thoughts? I know there are a lot of people on here with a better understanding of point clouds and what makes them complex. I'm more of a hardware guy.
Daniel Loney - Owner
Excelsior Measuring inc.
Vancouver - Okanagan - Calgary
www.ExcelsiorLevel.com

Daniel Wujanz
I have made 50-60 posts
Posts: 58
Joined: Fri Aug 24, 2018 11:26 am
Full Name: Daniel Wujanz
Company Details: technet GmbH
Company Position Title: 3D Laser Scanning Specialist
Country: Germany
Linkedin Profile: Yes
Location: Berlin
Has thanked: 8 times
Been thanked: 18 times

Re: System Benchmarking

Post by Daniel Wujanz »

Yes, but I have a few questions first. Which parameters are you aiming for? Are you trying to create a field test procedure (e.g. to check the scanner after an "uncertain" transport), or something that could be done in or around your office?

Cheers

Daniel

smacl
Global Moderator
Posts: 326
Joined: Tue Jan 25, 2011 5:12 pm
Full Name: Shane MacLaughlin
Company Details: Atlas Computers Ltd
Company Position Title: Managing Director
Country: Ireland
Linkedin Profile: Yes
Location: Ireland
Has thanked: 73 times
Been thanked: 76 times

Re: System Benchmarking

Post by smacl »

Firstly, I think this is a great area of discussion, but as Daniel points out, you first need to figure out a list of results you're trying to obtain and why. For example, are you looking at comparative accuracy between different pieces of equipment, or are you more interested in the drop in accuracy of a single piece of kit due to its calibration being off? Are you looking at comparative results across different pieces of software as well as hardware? Are you able to hold test conditions invariant, or is this a test you want to carry out at different sites?

The idea with testing such as benchmarking is to figure out the variable you're trying to measure or compare and isolate all the other potentially confounding variables in play during the measurement process. An example of a variable is repeatable accuracy over a given distance. Examples of confounding variables here are surface reflectivity, physical movement of the instrument due to vibration, weather, etc. You can control for these by, for example, repeating the same scan from the same position indoors with the scanner mounted on something not subject to vibration, such as a concrete pillar. In terms of controlling absolute accuracy, the best way in my opinion is to introduce control using an independent method of measurement, i.e. total stations and levelling.

Thinking about cost and testing effort, you don't need all of this all the time if you're only looking for gross calibration errors or a coarse comparison between two instruments. It's just a matter of figuring out which interference factors and confounding variables are in play and dealing with them as best as resources allow.

JClay15
Posts: 5
Joined: Wed Aug 22, 2018 12:30 am
Full Name: Jeffrey Clay
Company Details: TWM
Company Position Title: Reality Capture Specialist
Country: United States
Linkedin Profile: Yes

Re: System Benchmarking

Post by JClay15 »

I believe what Daniel Loney may be referring to is seeing what kind of import times different people's systems are taking. Is this correct, Daniel?

If so, I would be up for that.

Carbix
I have made 90-100 posts
Posts: 91
Joined: Sat Mar 16, 2019 1:36 am
Full Name: Daniel Loney
Company Details: Excelsior Measuring
Company Position Title: Owner
Country: Canada
Linkedin Profile: Yes
Location: Canada
Has thanked: 8 times
Been thanked: 9 times

Re: System Benchmarking

Post by Carbix »

JClay15 wrote:
Tue Feb 18, 2020 2:56 am
I believe what Daniel Loney may be referring to is seeing what kind of import times different people's systems are taking. Is this correct, Daniel?

If so, I would be up for that.
This one! lol

That said, I get what smacl is talking about. That one sounds like we would need a grant from the government.

My goal is to see what computer configuration works best, even things like pulling from a server vs. a local NVMe RAID. Is Intel better, or AMD? There are always hidden factors that won't come to light unless we compare our setups.

To do that we need a benchmark: a baseline, something we can all run that takes a reasonable amount of time.
Daniel Loney - Owner
Excelsior Measuring inc.
Vancouver - Okanagan - Calgary
www.ExcelsiorLevel.com

dhirota
V.I.P Member
Posts: 596
Joined: Sun Nov 01, 2009 11:18 pm
Full Name: Dennis Hirota
Company Details: Sam O. Hirota Inc.
Company Position Title: President
Country: USA
Linkedin Profile: Yes
Location: Hawaii, USA
Has thanked: 8 times
Been thanked: 40 times

Re: System Benchmarking

Post by dhirota »

Carbix wrote:
Mon Feb 17, 2020 7:13 am
I think it's time we came up with our own benchmark: something we can use to test our systems and see if they are performing to the standards of others.

I would be happy to provide scans from an RTC360.

Something with 3-5 scans in it. I would think it should take at least 30 minutes of run time to import.

Thoughts? I know there are a lot of people on here with a better understanding of point clouds and what makes them complex. I'm more of a hardware guy.
It depends on whether one is looking at the current hardware systems that most people use; the multi-core systems that have recently become available; operating systems (OS); internal networking, storage, and memory systems; application programs; and data set sizes. I believe that display size, type, and speed may be important in some situations. I mentioned this in an LSF thread several months ago. The improvement in throughput has been significant. As Jonathan Coco mentioned, maybe the 50-inch displays would have been better.

viewtopic.php?f=123&t=15539

All these have an impact on benchmark value.

We have been benchmarking all of our processing using our 4 different lidar and imaging sensors; using Windows 10 Pro and Ubuntu 18.04 LTS; and using 4-core, 6-core, 8-core, 18-core, and now 64-core platforms to check the benefits across many of the current applications. People on this LSF need to be innovative, understand their workflow, and improve on yesterday. We are continuously improving our field and office procedures, and the only way to accomplish that is to benchmark your work.

I would be willing to help develop and test any proposed benchmarks.
Dennis Hirota, PhD, PE, LPLS
[email protected]

Carbix
I have made 90-100 posts
Posts: 91
Joined: Sat Mar 16, 2019 1:36 am
Full Name: Daniel Loney
Company Details: Excelsior Measuring
Company Position Title: Owner
Country: Canada
Linkedin Profile: Yes
Location: Canada
Has thanked: 8 times
Been thanked: 9 times

Re: System Benchmarking

Post by Carbix »

dhirota wrote:
Tue Feb 18, 2020 8:31 am
It depends on whether one is looking at the current hardware systems that most people use; the multi-core systems that have recently become available; operating systems (OS); internal networking, storage, and memory systems; application programs; and data set sizes. I believe that display size, type, and speed may be important in some situations. I mentioned this in an LSF thread several months ago. The improvement in throughput has been significant. As Jonathan Coco mentioned, maybe the 50-inch displays would have been better.

viewtopic.php?f=123&t=15539

All these have an impact on benchmark value.

We have been benchmarking all of our processing using our 4 different lidar and imaging sensors; using Windows 10 Pro and Ubuntu 18.04 LTS; and using 4-core, 6-core, 8-core, 18-core, and now 64-core platforms to check the benefits across many of the current applications. People on this LSF need to be innovative, understand their workflow, and improve on yesterday. We are continuously improving our field and office procedures, and the only way to accomplish that is to benchmark your work.

I would be willing to help develop and test any proposed benchmarks.
This would be very helpful if any of us are willing to share benchmarking data. To start, we need to come up with some file sets: something that takes at least 30 minutes to run, and no less than 10 minutes on the strongest of computer setups.

We should come up with a list of benchmarks to compare.

We may want to tailor them to each platform that we use.

So, say for Register 360, we could look at the import time for 5 high-res scans with no auto options, then look at the time with any number of the options on. The same could be done for export to a given format like E57.

Next would be to see how fast it can be converted to an RCP.

Now, that's my most common workflow, and I'm sure a number of us use it. That said, we should think up a few "sections" to make up some "courses".

I think the raw scans should be a downloadable size; 20 GB should be reasonable.

Also, does anyone know of tracking software to help us measure how long this takes? I don't feel like sitting at a computer watching the screen with a stopwatch. That said, I could turn on a screen recorder.
Daniel Loney - Owner
Excelsior Measuring inc.
Vancouver - Okanagan - Calgary
www.ExcelsiorLevel.com

smacl
Global Moderator
Posts: 326
Joined: Tue Jan 25, 2011 5:12 pm
Full Name: Shane MacLaughlin
Company Details: Atlas Computers Ltd
Company Position Title: Managing Director
Country: Ireland
Linkedin Profile: Yes
Location: Ireland
Has thanked: 73 times
Been thanked: 76 times

Re: System Benchmarking

Post by smacl »

Carbix wrote:
Tue Feb 18, 2020 1:35 pm
Also, does anyone know of tracking software to help us measure how long this takes? I don't feel like sitting at a computer watching the screen with a stopwatch. That said, I could turn on a screen recorder.
The best approach here would probably be to create some scripted tests that log start and end times, either built-in scripts for programs that have that capability, or external ones using a tool such as AutoIt (see https://www.autoitscript.com/site/) or shell scripts. On Windows you could also have Perfmon running in the background to log various resource usage, see https://www.thewindowsclub.com/how-to-u ... or-windows
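As a rough illustration of the external-scripting idea, a small stdlib-only Python wrapper could launch an import as a command-line step, time it, and append the result to a CSV log. This is only a sketch; the importer command at the bottom is a placeholder, not a real program, and it assumes your software exposes a command-line entry point at all:

```python
import csv
import subprocess
import time
from datetime import datetime

def timed_run(label, cmd, log_path="benchmark_log.csv"):
    """Run a command, time it, and append label/start/elapsed/exit-code to a CSV log."""
    start = time.perf_counter()
    started_at = datetime.now().isoformat(timespec="seconds")
    result = subprocess.run(cmd)
    elapsed = time.perf_counter() - start
    with open(log_path, "a", newline="") as f:
        csv.writer(f).writerow([label, started_at, f"{elapsed:.1f}", result.returncode])
    return elapsed

# Placeholder example -- substitute whatever command-line import step
# your registration software actually provides:
# timed_run("e57-import", ["importer.exe", "--input", "scans.e57"])
```

The CSV log means everyone's results land in the same comparable shape, and the exit code catches runs that silently failed.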

I have something like this running full time in the office for software regression testing on SCC, i.e. continuously running various sets of client data to make sure new functionality doesn't break existing workflows or cause performance glitches. I use a package called TestComplete for this, which is excellent but not cheap.

Manual benchmarking is OK for a very occasional check, but it can get time-consuming very quickly if you need to repeat it at all regularly.

James Hall
V.I.P Member
Posts: 197
Joined: Tue Feb 02, 2010 5:13 pm
Full Name: James E Hall
Company Details: Precision Measurements Inc
Company Position Title: Survey Technician - Cyclone Modeler
Country: USA
Location: Chantilly, VA
Has thanked: 4 times
Been thanked: 28 times

Re: System Benchmarking

Post by James Hall »

First off, I like this idea.

Also, there are a few different workflows that are commonly used: Scanner to Cyclone Core, Scanner to Reg360, Scanner to Leica Field, Scanner to ReCap, Scanner to BLK360 Data Manager to Cyclone Core, and so on.

I'm not suggesting we make an all-inclusive list, but we could categorize our tests by scanner and import software, leaving the hardware as the variable. I use a C10 to Cyclone Core for most of my downloads and BLK360 to Data Manager to Cyclone Core for the rest. This way I can back up my raw files independently of the registration software.

So when we report benchmark times, I would run a test based on a standardized output from a particular scanner into a particular piece of software. I could also run a standardized data set from any scanner I'm evaluating for potential upgrades.

The criterion for a standardized data set from a scanner should be an average of 3-5 high-resolution scan setups with images. We should also draw up a torture-test variant to put our storage solutions to the test: say 50 scans run 4 times in a row, something that gets up into the 200 GB range. Most storage solutions can buffer a certain amount of data; once that buffer is full, the transfer slows down.
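On the buffer point, you can see that effect without any scan data at all by writing a large file in fixed chunks and watching the per-chunk rate; when a cache or buffer fills, the later chunks slow down. This is a hedged sketch of that idea, not a substitute for a real import test, and the sizes are illustrative:

```python
import os
import time

def write_throughput(path, total_mb=512, chunk_mb=64):
    """Write a file in fixed chunks, fsyncing each one, and return MB/s per chunk.
    A drop in the later rates suggests a write buffer/cache has filled up."""
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    rates = []
    with open(path, "wb") as f:
        for _ in range(total_mb // chunk_mb):
            start = time.perf_counter()
            f.write(chunk)
            f.flush()
            os.fsync(f.fileno())  # force the chunk to disk before timing stops
            rates.append(chunk_mb / (time.perf_counter() - start))
    os.remove(path)
    return rates

# for i, rate in enumerate(write_throughput("bench.tmp")):
#     print(f"chunk {i}: {rate:.0f} MB/s")
```

For an SLC-cached consumer SSD you would want the total well past the cache size, which is why the 200 GB figure above makes sense for the real test.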

We would also need to describe our hardware and configuration settings in a standardized way to ease comparisons.
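For that standardized description, something as simple as a stdlib-only Python summary could be a starting point; note that RAM, GPU, and storage details would need vendor tools or a third-party library such as psutil, so they are left out of this sketch:

```python
import json
import os
import platform

def system_summary():
    """Collect a basic, comparable description of the test machine (stdlib only)."""
    return {
        "machine": platform.machine(),          # e.g. AMD64 / x86_64
        "processor": platform.processor(),      # CPU string (may be empty on some OSes)
        "logical_cores": os.cpu_count(),
        "os": f"{platform.system()} {platform.release()}",
        "python": platform.python_version(),
    }

print(json.dumps(system_summary(), indent=2))
```

Everyone pasting the same JSON shape alongside their timings would make the comparison table almost build itself.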

James

Carbix
I have made 90-100 posts
Posts: 91
Joined: Sat Mar 16, 2019 1:36 am
Full Name: Daniel Loney
Company Details: Excelsior Measuring
Company Position Title: Owner
Country: Canada
Linkedin Profile: Yes
Location: Canada
Has thanked: 8 times
Been thanked: 9 times

Re: System Benchmarking

Post by Carbix »

I think we need to MVP this thing (minimum viable product), see how we all like it, and move on from there.

Should we start with raw hardware scans, or say an E57 (or other format) that is not cleaned in any way? I personally think we should come up with a library of hardware scans that we all agree on: 3-5 scans, with images if applicable, of something interesting. I have about 30 TB of raw RTC scans that I can pick from.

Let's make a list of the most commonly used scanners that put out a unique file format. For instance, a P30, P40, and P50 all put out roughly (don't hold me to this) the same files, and I would assume the same is true for most of the FARO line. The BLK360 might be a little different, and I don't know what the C10 puts out.

I think each scan environment should be relative to what the hardware was designed for: no small-room scans with a Pxx series; the BLK360 should be small and close; the RTC360 could be a mix of mid-range and indoor scans (I'm curious to see how foliage will affect this kind of stuff).

I have BLK360 and RTC360 data. Who has C10, Pxx, and FARO?
Daniel Loney - Owner
Excelsior Measuring inc.
Vancouver - Okanagan - Calgary
www.ExcelsiorLevel.com
