Computer Specs, Workflows and optimization.

Total Stations, CAD Workstations, SSD`s, Imaging Systems, Targets & Tripods Etc.
woodys14a
Posts: 1
Joined: Fri Feb 03, 2017 3:14 pm
Full Name: Aaron Wood
Company Details: Berry And Escott Engineering
Company Position Title: Design And Project Engineer
Country: UK
Linkedin Profile: Yes

Computer Specs, Workflows and optimization.

Post by woodys14a » Tue Mar 12, 2019 2:15 pm

Hello guys,

I work for a small engineering company in Somerset, UK, where for a little over two years now we have been working with point-cloud data captured with a Faro Focus X130. Primarily we use this data to capture as-built factories, plants, and equipment, carry out our design work in SolidWorks, and then either drop the models in for clash/fit detection or build up a CAD model of the existing structure to work from.

We are finding that more and more scan work is being carried out and we are dealing with larger data sets. For example, the latest data set I was working with was some 350 million points after deleting data and equipment that were no longer required. This scan was carried out at a slightly higher resolution and with less decimation than we would typically use, as a higher level of precision was required.
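To get a feel for why clouds of this size strain 32 GB of RAM, it helps to estimate the raw in-memory footprint. The byte counts below are illustrative assumptions (32-bit XYZ, 8-bit RGB, 16-bit intensity), not the actual layout ReCap or Navisworks uses:

```python
# Rough memory footprint of a point cloud held fully in RAM.
# Per-point sizes are assumptions for illustration only --
# real scan formats and application caches vary widely.
def cloud_size_gb(points, bytes_per_point):
    """Return the raw size in GiB of `points` points."""
    return points * bytes_per_point / 1024**3

xyz = 3 * 4          # three float32 coordinates
rgb = 3 * 1          # one byte per colour channel
intensity = 2        # uint16 intensity value
per_point = xyz + rgb + intensity  # 17 bytes per point

print(round(cloud_size_gb(350_000_000, per_point), 1))  # ~5.5 GiB raw
```

And that is before any spatial index, normals, or application overhead, which can easily multiply the working-set size several times over.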

My main question (and I know this topic has been covered often, as I have searched) is: what computer specs should I be looking at to handle data of this size?
Our workflow is ReCap Pro -> Navisworks and SolidWorks -> Navisworks, which allows the SolidWorks model to be overlaid on the point cloud so we can take accurate measurements.

Stitching the scans is fine, as I can leave the PC indexing them and it remains reasonably responsive while I work on something else. But when I come to update the SolidWorks model in Navisworks by refreshing the referenced file location, it takes some 10 minutes each time, and when making many small changes this becomes very problematic.

For reverse engineering, say the floors and levels of the building, I have been trialling the Xtract3D add-in for SolidWorks, which allows me to get around the 250,000-point limit. This worked fine on smaller point clouds (around 20 million points), but with the bigger clouds I hit a memory limit and SolidWorks crashes immediately.
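One common way to stay under such point limits is to thin the cloud before import. ReCap and Xtract3D have their own decimation controls, but the underlying idea is voxel downsampling: keep one point per cubic cell of a chosen size. A minimal NumPy sketch, with a hypothetical `voxel_downsample` helper (not any vendor's API):

```python
import numpy as np

def voxel_downsample(points, voxel=0.01):
    """Keep one point per cubic voxel of edge length `voxel` (metres).
    Illustrative only -- production tools use smarter strategies such
    as averaging the points within each voxel."""
    keys = np.floor(points / voxel).astype(np.int64)   # voxel index per point
    _, idx = np.unique(keys, axis=0, return_index=True)  # first point per voxel
    return points[np.sort(idx)]

rng = np.random.default_rng(0)
cloud = rng.random((100_000, 3))            # synthetic 1 m cube of points
thinned = voxel_downsample(cloud, voxel=0.05)
print(len(thinned))                          # at most 20^3 = 8000 points
```

Coarser voxels shrink the cloud dramatically at the cost of detail, so for reverse engineering floors and levels a relatively large cell size may be acceptable.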

Currently I use:
- Dell Precision 7810
- Intel Xeon E5-2630 2.40GHz 8 core
- 32GB RAM
- Quadro M2000
- SM961 NVMe SAMSUNG 512GB

As I am not fully clued up on PC performance and optimised components, I don't want to suggest the company go and spend thousands on something that is not going to improve the operation.

Any recommendations would be appreciated, and if you work with these sorts of data sets, sharing your specification would be helpful.

Also, as we use SolidWorks (we won't get away from this), it only uses a single core for most processing, and I know our current CPU is not the best for that, as we have found with large assemblies. So we need almost the best of both worlds in a processor: high single-core speed for SolidWorks and plenty of cores for point-cloud indexing.

Thanks Aaron :D
