Help- Point Cloud creation with larger than our normal data set
-
- I have made 10-20 posts
- Posts: 17
- Joined: Thu Feb 21, 2019 9:18 pm
- 4
- Full Name: thiep nguyen
- Company Details: government
- Company Position Title: scanner
- Country: usa
- Linkedin Profile: No
- Has thanked: 12 times
- Been thanked: 4 times
Help- Point Cloud creation with larger than our normal data set
I have a project that is about 140 scans. This is the largest project we have tackled to date. Most of our projects are small rooms/compartments that are typically no more than 40 scans. Our scan settings are about 1/5 @ 4x (28 Mpts).
When creating the point cloud, we are limiting the distance and also have the "eliminate duplicate points" setting on high. Our systems are Core i9 processors running at 3 GHz with 128 GB of RAM. Our graphics cards are Quadro RTX 6000s. We are running off two 5 TB hard drives, with one of them set as the scratch disk.
Here is the question.
How is it that I see many people on here working on projects with 500-700 scans, but this 140-scan project is stressing my system so much that it cannot create a point cloud after multiple days and crashes? What am I missing?
Any insight would help.
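For a rough sense of the data volume involved, here is a back-of-envelope estimate. It assumes every scan keeps its full 28 Mpts and a notional ~26 bytes per point; the actual per-point cost varies with the format and settings, so treat the numbers as order-of-magnitude only.

```python
# Rough data-volume estimate for a 140-scan project at 28 Mpts/scan.
scans = 140
points_per_scan = 28_000_000
bytes_per_point = 26  # assumed average, not a FARO-published figure

total_points = scans * points_per_scan
total_gb = total_points * bytes_per_point / 1e9

print(f"{total_points / 1e9:.2f} billion points, ~{total_gb:.0f} GB")
# -> 3.92 billion points, ~102 GB
```

Even before duplicate elimination, that is billions of points being shuffled between RAM and the scratch disk, which is why disk speed matters so much here.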
- Jason Warren
- Administrator
- Posts: 4190
- Joined: Thu Aug 16, 2007 9:21 am
- 16
- Full Name: Jason Warren
- Company Details: Laser Scanning Forum Ltd
- Company Position Title: Co-Founder
- Country: UK
- Skype Name: jason_warren
- Linkedin Profile: No
- Location: Retford, UK
- Has thanked: 421 times
- Been thanked: 222 times
- Contact:
Re: Help- Point Cloud creation with larger than our normal data set
Are you using SSDs or NVMe drives?
Jason Warren
Co-Founder
Dedicated to 3D Laser Scanning
LaserScanningForum
-
- I have made 10-20 posts
- Posts: 17
- Joined: Thu Feb 21, 2019 9:18 pm
- 4
- Full Name: thiep nguyen
- Company Details: government
- Company Position Title: scanner
- Country: usa
- Linkedin Profile: No
- Has thanked: 12 times
- Been thanked: 4 times
Re: Help- Point Cloud creation with larger than our normal data set
Our OS runs off an NVMe drive. Our two 5 TB hard drives are SSDs.
- smacl
- Global Moderator
- Posts: 1358
- Joined: Tue Jan 25, 2011 5:12 pm
- 12
- Full Name: Shane MacLaughlin
- Company Details: Atlas Computers Ltd
- Company Position Title: Managing Director
- Country: Ireland
- Linkedin Profile: Yes
- Location: Ireland
- Has thanked: 601 times
- Been thanked: 618 times
- Contact:
Re: Help- Point Cloud creation with larger than our normal data set
My stock answer to this is to do a bit of profiling to see which resources are being maxed out when you're processing a large job. The easiest way to do this is to fire up Task Manager while your system is under stress and see which resource (CPU, memory, disk, GPU, network) is coming under pressure. I'm not a SCENE user myself, but more generally I've seen various security, antivirus and backup solutions play havoc with the performance of software that is heavy on disk access. The Performance pane in Task Manager will tell you which resource is under pressure; the Processes pane will tell you which program is the culprit. It should be SCENE, but if not, there may be other background software that is being so zealous it is killing performance (Bullguard, Kaspersky, Norton etc. need to be configured for heavy-duty processing workloads). For more in-depth performance analysis, have a look at perfmon.
tipwynn wrote: ↑Tue Apr 25, 2023 2:31 pm
I have a project that is about 140 scans. This is the largest project we have tackled to date. Most of our projects are small rooms/compartments that are typically no more than 40 scans. Our scan settings are about 1/5 @ 4x (28 Mpts).
When creating the point cloud, we are limiting the distance and also have the "eliminate duplicate points" setting on high. Our systems are Core i9 processors running at 3 GHz with 128 GB of RAM. Our graphics cards are Quadro RTX 6000s. We are running off two 5 TB hard drives, with one of them set as the scratch disk.
Here is the question.
How is it that I see many people on here working on projects with 500-700 scans, but this 140-scan project is stressing my system so much that it cannot create a point cloud after multiple days and crashes? What am I missing?
Any insight would help.
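The Task Manager / perfmon approach above can also be cross-checked with a small script. This is a minimal sketch using only the Python standard library (the `snapshot` helper is my own name, not part of any SCENE tooling, and the load average is only available on Unix-like systems; on Windows, stick with Task Manager as described):

```python
import os
import shutil
import time

def snapshot(path="."):
    """Crude resource snapshot: free space on the drive holding `path`,
    plus the CPU load average where the OS provides one."""
    usage = shutil.disk_usage(path)
    line = f"disk free: {usage.free / 1e9:.1f} GB of {usage.total / 1e9:.1f} GB"
    if hasattr(os, "getloadavg"):  # not available on Windows
        load1, load5, _ = os.getloadavg()
        line += f" | load avg (1m/5m): {load1:.2f}/{load5:.2f}"
    return line

# Sample a few times while the point-cloud job is running:
for _ in range(3):
    print(snapshot())
    time.sleep(1)
```

If the free space on the scratch drive collapses toward zero during processing, that alone can explain multi-day runs and crashes.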
- Kruse
- V.I.P Member
- Posts: 314
- Joined: Tue Jul 27, 2021 3:09 pm
- 2
- Full Name: Eric Kruse
- Company Details: Hensel Phelps - Construction
- Company Position Title: VDC Manager
- Country: United States
- Linkedin Profile: No
- Has thanked: 69 times
- Been thanked: 170 times
Re: Help- Point Cloud creation with larger than our normal data set
While I'm not a SCENE user anymore, take a look at this help page from FARO's knowledge base.
tipwynn wrote: ↑Tue Apr 25, 2023 2:31 pm
I have a project that is about 140 scans. This is the largest project we have tackled to date. Most of our projects are small rooms/compartments that are typically no more than 40 scans. Our scan settings are about 1/5 @ 4x (28 Mpts).
When creating the point cloud, we are limiting the distance and also have the "eliminate duplicate points" setting on high. Our systems are Core i9 processors running at 3 GHz with 128 GB of RAM. Our graphics cards are Quadro RTX 6000s. We are running off two 5 TB hard drives, with one of them set as the scratch disk.
Here is the question.
How is it that I see many people on here working on projects with 500-700 scans, but this 140-scan project is stressing my system so much that it cannot create a point cloud after multiple days and crashes? What am I missing?
Any insight would help.
https://knowledge.faro.com/Software/FAR ... erformance
Lots of good info in there to help troubleshoot and maximise performance. One thing I've noticed in the past is to ensure the GPU is set as the default for your application; it can be changed through the Nvidia Control Panel. Also, install/check Nvidia GeForce Experience for GPU driver updates. When stuff goes haywire or slow for me, that's been the culprit in the past.
- ProCro
- V.I.P Member
- Posts: 149
- Joined: Thu Jun 27, 2019 2:00 pm
- 4
- Full Name: Nino Skupnjak
- Company Details: SKIMI64
- Company Position Title: procurator
- Country: Croatia
- Linkedin Profile: Yes
- Location: Croatia
- Has thanked: 7 times
- Been thanked: 30 times
- Contact:
Re: Help- Point Cloud creation with larger than our normal data set
When you're about to create the project point cloud, at the bottom of the dialog box there should be info on whether there is enough space on your drive(s) for the project point cloud and the temp folder. I'm doing far larger projects with an i9, 128 GB RAM and a GTX 1080, and even on a laptop (i9-12900H, 64 GB RAM, RTX 3070 Ti).
If there is not enough space on the system drive for the temp folder, you can relocate it to a different drive (in the settings).
-
- I have made 60-70 posts
- Posts: 66
- Joined: Wed May 20, 2015 1:44 pm
- 8
- Full Name: Dustin Manning
- Company Details: luxpoint
- Company Position Title: Owner
- Country: United States
- Linkedin Profile: Yes
- Has thanked: 17 times
- Been thanked: 13 times
Re: Help- Point Cloud creation with larger than our normal data set
I've found that choosing to eliminate duplicate points at that high a setting will increase the processing time exponentially. Maybe use the default?
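For intuition on why this step is expensive: duplicate elimination is commonly done by snapping points to a spatial grid and keeping one point per occupied cell, so on a multi-billion-point project the grid bookkeeping dominates. The sketch below shows that generic idea only; it is not FARO's actual algorithm, and the `dedupe_points` helper is a name of my own invention.

```python
def dedupe_points(points, tolerance):
    """Voxel-grid duplicate elimination (generic sketch, not SCENE's
    algorithm): snap each (x, y, z) point to a grid cell of size
    `tolerance` and keep only the first point seen in each cell."""
    kept = {}
    for x, y, z in points:
        cell = (round(x / tolerance), round(y / tolerance), round(z / tolerance))
        kept.setdefault(cell, (x, y, z))  # first point in the cell wins
    return list(kept.values())

points = [(0.0, 0.0, 0.0), (0.001, 0.0, 0.0), (1.0, 0.0, 0.0)]
print(len(dedupe_points(points, 0.01)))  # the first two collapse into one -> 2
```

Every input point has to be hashed and looked up, so the cost grows with the total point count regardless of how many duplicates are actually removed, which fits the observation that the high setting balloons processing time on large projects.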
-
- I have made 60-70 posts
- Posts: 66
- Joined: Wed May 20, 2015 1:44 pm
- 8
- Full Name: Dustin Manning
- Company Details: luxpoint
- Company Position Title: Owner
- Country: United States
- Linkedin Profile: Yes
- Has thanked: 17 times
- Been thanked: 13 times
Re: Help- Point Cloud creation with larger than our normal data set
Also, what software are you going to use your point cloud in after it's complete?
-
- I have made 10-20 posts
- Posts: 17
- Joined: Thu Feb 21, 2019 9:18 pm
- 4
- Full Name: thiep nguyen
- Company Details: government
- Company Position Title: scanner
- Country: usa
- Linkedin Profile: No
- Has thanked: 12 times
- Been thanked: 4 times
Re: Help- Point Cloud creation with larger than our normal data set
After we complete the point cloud, it is transferred to our engineering department; they primarily use SCENE LT and ReCap.
Last edited by tipwynn on Wed Apr 26, 2023 2:51 pm, edited 1 time in total.
-
- I have made 10-20 posts
- Posts: 17
- Joined: Thu Feb 21, 2019 9:18 pm
- 4
- Full Name: thiep nguyen
- Company Details: government
- Company Position Title: scanner
- Country: usa
- Linkedin Profile: No
- Has thanked: 12 times
- Been thanked: 4 times
Re: Help- Point Cloud creation with larger than our normal data set
For a comparable project, how long does it typically take to build a point cloud of that size?
ProCro wrote: ↑Wed Apr 26, 2023 7:35 am
When you're about to create the project point cloud, at the bottom of the dialog box there should be info on whether there is enough space on your drive(s) for the project point cloud and the temp folder. I'm doing far larger projects with an i9, 128 GB RAM and a GTX 1080, and even on a laptop (i9-12900H, 64 GB RAM, RTX 3070 Ti).
If there is not enough space on the system drive for the temp folder, you can relocate it to a different drive (in the settings).