Nvidia RTX 2080 Ti

CAD Workstations, Network Attached Storage [NAS], Servers, Monitors, SSDs, Imaging Systems, Field Kit: Targets & Tripods etc.
fillway
I have made 30-40 posts
Posts: 31
Joined: Thu Sep 06, 2018 6:51 pm
Full Name: Philip Weyhe
Company Details: 3DGS
Company Position Title: CTO
Country: USA
Linkedin Profile: Yes
Has thanked: 1 time

Nvidia RTX 2080 Ti

Post by fillway »

Does anyone know of any applications of ray tracing to laser scanning? Given the similar nature of what RTX accomplishes, I would imagine there could be some correlation...

Maybe in eliminating the noise generated by mirrors, windows, or water during the scanning process? The ability to trace a ray and notice it is bouncing off at a "weird" angle, for example.

I have not done any research into what this technology does outside of making video games look prettier, so I might be completely off on that assumption. Curious to know if anyone more learned on the subject has any knowledge to add or possibly enlighten fellow scanners with.

Also, any real-world comparisons to the 1080 Ti or current top-of-the-line workstation GPUs would be interesting to discuss here.
Philip Weyhe
CTO @ 3DGS.io
linkedin.com/in/fillway
www.3DGS.io
neeravbm
V.I.P Member
Posts: 136
Joined: Thu Mar 16, 2017 3:29 pm
Full Name: Neerav Mehta
Company Details: Indoor Intelligence
Company Position Title: CTO
Country: USA
Linkedin Profile: No
Has thanked: 1 time
Been thanked: 6 times

Re: Nvidia RTX 2080 Ti

Post by neeravbm »

They are not related, and I don't expect ray-tracing technology in the GPU to have any bearing on the laser scanning process or post-processing.

As you mentioned, ray tracing is used to render a scene more realistically. Computationally, it shoots many rays of light from the camera and follows each ray in the presence of reflection, refraction, etc. Earlier, ray tracing was done in software and hence was very slow. As a result, the technology was only used for rendering static scenes or movies, where you have lots of time to render a scene. It was not used in games, where the scene needs to be rendered in real time. In the RTX 2080 Ti, ray tracing is done in hardware, hence it is much faster and can potentially be done in real time. As a result, it can now also be done in games. I would therefore expect the rendering quality of games to improve significantly if you have this graphics card, as long as the game has been optimized for it.
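To make the idea concrete, here is a minimal sketch of what "shooting a ray from the camera and following it" means computationally: intersect a ray with a sphere, then mirror-reflect it at the hit point. The scene, names, and single bounce are illustrative, not taken from any real renderer.

```python
# Minimal ray-tracing sketch: cast one ray, find the hit, reflect it.
import math

def hit_sphere(origin, direction, center, radius):
    """Return distance t to the nearest intersection, or None if missed."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * c  # direction assumed unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def reflect(d, n):
    """Mirror direction d about the surface normal n."""
    k = 2 * sum(a * b for a, b in zip(d, n))
    return [a - k * b for a, b in zip(d, n)]

# One ray straight down the z axis toward a unit sphere at z = 5.
origin, direction = [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]
t = hit_sphere(origin, direction, [0.0, 0.0, 5.0], 1.0)       # t == 4.0
hit = [o + t * d for o, d in zip(origin, direction)]          # [0, 0, 4]
normal = [0.0, 0.0, -1.0]                                     # sphere normal at the hit
bounced = reflect(direction, normal)                          # [0, 0, -1], back to camera
```

A real renderer does this for millions of rays per frame, recursing on each reflection and refraction, which is exactly why dedicated hardware makes the difference between offline and real-time rendering.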

In the laser scanning field, GPUs are used only during processing, not during capture. During processing, Nvidia's CUDA language is used to do thousands of operations in parallel. Ray tracing by itself has no effect on the CUDA language or its capabilities. So other than gains from increases in clock speed or on-chip RAM, I don't expect the ray-tracing capability to have any effect on laser scan post-processing.
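The "thousands of operations in parallel" pattern described above is data parallelism. CUDA itself is C-like, but the same idea can be shown on the CPU with NumPy: one rigid-body transform applied to every point of a cloud in a single vectorized call instead of a Python loop. The cloud and transform here are synthetic, purely for illustration.

```python
# Data-parallel point-cloud transform: R*p + t applied to all points at once.
import numpy as np

def transform_cloud(points, rotation, translation):
    """Apply a rigid transform to an (N, 3) array of points in one call."""
    return points @ rotation.T + translation

rng = np.random.default_rng(0)
cloud = rng.random((100_000, 3))             # synthetic 100k-point cloud
theta = np.pi / 2                            # 90-degree rotation about z
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([1.0, 2.0, 3.0])
moved = transform_cloud(cloud, R, t)         # shape (100000, 3)
```

On a GPU the same operation would be one kernel launch with one thread per point; the math per point is identical.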
Neerav Mehta
CTO, Indoor Intelligence
Creators of http://scantobim.xyz and http://rep3d.com
smacl
Global Moderator
Posts: 1409
Joined: Tue Jan 25, 2011 5:12 pm
Full Name: Shane MacLaughlin
Company Details: Atlas Computers Ltd
Company Position Title: Managing Director
Country: Ireland
Linkedin Profile: Yes
Location: Ireland
Has thanked: 627 times
Been thanked: 657 times

Re: Nvidia RTX 2080 Ti

Post by smacl »

I'm guessing 2 x 1080 Ti cards would make more sense for the bulk of scanning work that uses the GPU. I'm not sure how useful the ray tracing would be beyond rendering meshed models in real time.
Shane MacLaughlin
Atlas Computers Ltd
www.atlascomputers.ie

SCC Point Cloud module
Wdigman - Europe
I have made 20-30 posts
Posts: 23
Joined: Thu Jul 19, 2018 7:18 am
Full Name: Warren Digman
Company Details: Crystal Prism Consulting
Company Position Title: Owner
Country: Latvia - USA
Linkedin Profile: Yes
Been thanked: 2 times

Re: Nvidia RTX 2080 Ti

Post by Wdigman - Europe »

It's my understanding that these cards are specifically for gaming. Sure, they will work for Cyclone, CAD, and displaying point clouds in general; however, you may be wasting your money.

For high-end rendering and Revit/CAD work in 3D, I have read that the Quadro-series cards are what professionals use. I have a machine with a Quadro 2200 (an older card) and a machine that uses a GTX 750. When processing point clouds, I don't think either card is being used. When exporting, unifying, filtering, and registering, I can see in Task Manager that everything is working hard on the processor, and the hard drives, if mechanical, click away.

Since so many video cards these days are geared toward gamers wanting 120 FPS, I don't see any benefit to an expensive video card for processing points. I may be wrong, but that is a question for the engineers who wrote the code for the software you use.

Any software engineers from Leica or Faro are welcome to join in.

Personally, I have found that point-processing software is CPU intensive; more cores and a faster processor are what matter. The CPU, solid-state drives, and more memory are also where you want to spend your money.

I have also noticed that some tasks are multi-threaded and some are not. It just depends on what you are doing.

Again, any software engineers who want to chime in, I am game. I plan on building a new machine soon, so I will be watching this thread.

I wonder how fast I could process E57 files in ReCap using an Intel i9 (16-core processor) with 36 threads running.
smacl
Global Moderator
Posts: 1409
Joined: Tue Jan 25, 2011 5:12 pm
Full Name: Shane MacLaughlin
Company Details: Atlas Computers Ltd
Company Position Title: Managing Director
Country: Ireland
Linkedin Profile: Yes
Location: Ireland
Has thanked: 627 times
Been thanked: 657 times

Re: Nvidia RTX 2080 Ti

Post by smacl »

Wdigman - Europe wrote: Fri Oct 26, 2018 10:41 am: Again, any software engineers who want to chime in, I am game. I plan on building a new machine soon, so I will be watching this thread.
As the owner of a small software house, I spend most of my hours writing point cloud software these days and watch what is going on with the development side of this industry. Development is changing rapidly; more developers are getting up to speed with multi-threaded CPU algorithms and, to a lesser extent, GPU programming. What I think you'll see as new releases of point cloud processing software emerge is that they'll increasingly use these technologies. Note that writing and debugging multi-threaded code is more difficult than single-threaded code, particularly when refactoring existing code, and porting to the GPU is more difficult still. As such, it is expensive and will appear in dribs and drabs.

Buying a new workstation today, the processors with faster single-threaded performance, such as the i9, will be fastest for many existing applications and complex routines like meshing. As time goes on, more cores will become increasingly important, so the likes of the Threadripper 1950 and 2950 will more likely dominate in newer and yet-to-be-released software. For point clouds, effective GPU usage beyond just graphics will also start making its impact felt, so GPUs with a big core count and plenty of memory will make a huge difference for certain tasks in certain software. Nvidia cards are currently the better bet, as the Nvidia-specific GPU language CUDA is very popular.

I was helping a friend choose a decent but cost-effective point cloud workstation last week, and after going through the options we went with a Dell Alienware with an AMD Threadripper 1950, 64 GB RAM, twin 1080 Ti GPUs, a 2 TB SSD, and 2 x 8 TB HDDs. If it were for myself, I'd probably build rather than buy, but for those looking for hardware support, Dell is a good option.
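The multi-threading Shane describes usually means splitting the cloud into independent chunks and farming them out to worker threads. This is a generic sketch under that assumption (the chunked bounding-box task and names are illustrative, not anyone's actual product code); NumPy releases the GIL inside its reductions, so the threads can genuinely overlap on large arrays.

```python
# Chunk-parallel bounding box of a point cloud using a thread pool.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def chunk_bounds(chunk):
    """Axis-aligned bounding box of one chunk of points."""
    return chunk.min(axis=0), chunk.max(axis=0)

def cloud_bounds(points, workers=4):
    """Bounding box of the whole cloud, computed chunk-wise in parallel."""
    chunks = np.array_split(points, workers)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(chunk_bounds, chunks))
    lo = np.min([r[0] for r in results], axis=0)   # combine per-chunk minima
    hi = np.max([r[1] for r in results], axis=0)   # combine per-chunk maxima
    return lo, hi
```

The hard part Shane points at is that not every algorithm decomposes this cleanly; registration and meshing have cross-chunk dependencies that make the refactoring expensive.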
Shane MacLaughlin
Atlas Computers Ltd
www.atlascomputers.ie

SCC Point Cloud module
jedfrechette
V.I.P Member
Posts: 1236
Joined: Mon Jan 04, 2010 7:51 pm
Full Name: Jed Frechette
Company Details: Lidar Guys
Company Position Title: CEO and Lidar Supervisor
Country: USA
Linkedin Profile: Yes
Location: Albuquerque, NM
Has thanked: 62 times
Been thanked: 219 times

Re: Nvidia RTX 2080 Ti

Post by jedfrechette »

At this point, are there any scanning packages out there that make significant use of GPU computing? Certainly the photogrammetry apps do, but that's a slightly different problem set.
Jed
smacl
Global Moderator
Posts: 1409
Joined: Tue Jan 25, 2011 5:12 pm
Full Name: Shane MacLaughlin
Company Details: Atlas Computers Ltd
Company Position Title: Managing Director
Country: Ireland
Linkedin Profile: Yes
Location: Ireland
Has thanked: 627 times
Been thanked: 657 times

Re: Nvidia RTX 2080 Ti

Post by smacl »

jedfrechette wrote: Sat Oct 27, 2018 6:55 pm At this point are there any scanning packages out there that make any significant use of GPU computing? Certainly the photogrammetry apps do, but that's a little bit different problem set.
Hi Jed,

I think this is something developers are only starting to get their heads around; I know that's certainly the case for me. I'm not sure where other software houses are with this, but the latest version of SCC has its first bit of GPU code, for real-time normal generation, as can be seen in the clip below. The plan is to extend this to fully automated facade extraction over the next couple of months. My experience of GPU coding to date is that it takes more time and effort than multi-threading, which in turn takes more time and effort than single-threaded code. Not all algorithms are easily multi-threaded, and not all of those are suitable for porting to the GPU. That said, I'm certainly planning to use it much more going forward, and I'd imagine all the major vendors are doing likewise. FWIW, the 2080 Ti doesn't really offer much over the 1080 Ti here for general-purpose programming on the GPU.


youtu.be/DfwCoT0z2OI
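For readers unfamiliar with normal generation: a common approach is to fit a plane to each point's local neighbourhood via PCA, taking the eigenvector of the smallest eigenvalue of the neighbourhood covariance as the normal. This is a generic CPU sketch of that standard technique, not SCC's actual GPU implementation; the brute-force neighbour search is purely illustrative.

```python
# Per-point normal estimation by PCA over the k nearest neighbours.
import numpy as np

def estimate_normal(points, index, k=8):
    """Normal at points[index], from a plane fit to its k nearest neighbours."""
    d = np.linalg.norm(points - points[index], axis=1)
    neigh = points[np.argsort(d)[:k]]        # brute force; real code uses a k-d tree
    cov = np.cov(neigh.T)                    # 3x3 neighbourhood covariance
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    return eigvecs[:, 0]                     # smallest eigenvalue -> plane normal
```

Each point's normal is independent of every other point's, which is exactly why this particular job ports so naturally to the GPU.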
Shane MacLaughlin
Atlas Computers Ltd
www.atlascomputers.ie

SCC Point Cloud module
samik.pulsar
I have made <0 posts
Posts: 2
Joined: Thu Oct 11, 2018 7:16 am
Full Name: Samik Dutta
Company Details: Pulsar Advanced Engineering
Company Position Title: Business Development Manager
Country: India
Linkedin Profile: Yes

Re: Nvidia RTX 2080 Ti

Post by samik.pulsar »

From a visual-effects standpoint, real-time ray tracing (RT) can provide photorealistic effects by simulating real-world lighting from definite light sources. Dynamic lighting in games has been around for a while, and RT is basically an evolution over traditional methods of rendering light. RT algorithms will be used more in upcoming games/movies/animations for realistic reflections, refraction, bloom effects, explosions, and practically any interaction between light rays and an object. On the flip side, this will take a lot of GPU computing power to get good frame rates in video games, especially in the early days, and developers will also need some time to adopt RT. As of now only one game supports it (Shadow of the Tomb Raider), and there are lots of unanswered questions.

Coming to laser scanning, RT has the potential to enhance post-processing and design work, as it takes a mathematically accurate approach to lighting. Simulated walkthroughs and flythroughs would be more realistic, and much effort in light rendering could be saved if RT is applied. Still rendering from point clouds could be enhanced with better lighting, but since this technology is new, a definite cost-benefit analysis is lacking. How quickly RT penetrates design and post-processing (or whether it does at all) remains to be seen.
Nvidia has already unveiled an RT-based Quadro GPU line:
https://nvidianews.nvidia.com/news/nvid ... racing-gpu
AMD has an open-source RT platform called Radeon Rays, which is supported by their workstation GPU, the Radeon Pro WX8200.
And it's not just the hardware; applications will also depend on how practical the developers of design software deem RT to be, and when they wish to support it.
fillway
I have made 30-40 posts
Posts: 31
Joined: Thu Sep 06, 2018 6:51 pm
Full Name: Philip Weyhe
Company Details: 3DGS
Company Position Title: CTO
Country: USA
Linkedin Profile: Yes
Has thanked: 1 time

Re: Nvidia RTX 2080 Ti

Post by fillway »

My question was whether the increased efficiency of RT would or could be applied either in scanning hardware or in the early processing stages of scan data, eliminating certain time-consuming aspects of later processing, such as minimizing or removing altogether the data distortion from mirrored, reflective, or glass surfaces.

My thought, or hope, was that since a single laser is a "ray" or beam of light, this technology would somehow provide better, if not quicker, insight into what happens to that light when emitted from a scanner, thus improving the data received during the scanning process.

Again, these questions arise mainly from the name of the technology rather than a real in-depth understanding of what it does, so I might be totally off base.
Philip Weyhe
CTO @ 3DGS.io
linkedin.com/in/fillway
www.3DGS.io
smacl
Global Moderator
Posts: 1409
Joined: Tue Jan 25, 2011 5:12 pm
Full Name: Shane MacLaughlin
Company Details: Atlas Computers Ltd
Company Position Title: Managing Director
Country: Ireland
Linkedin Profile: Yes
Location: Ireland
Has thanked: 627 times
Been thanked: 657 times

Re: Nvidia RTX 2080 Ti

Post by smacl »

At a guess, hardware RT cores wouldn't be of much benefit to laser scanning, as points produced by laser scanning are already effectively the result of tracing a single ray from the laser to the first point of incidence (and back again). From what I gather, the traced rays need to bounce off triangles for these cores to be used. There's a good article on it here. That said, once the hardware is there, no doubt somebody will figure out other uses for it. With respect to points from mirrors, it should be possible to remove them using cloud-to-cloud comparison between different setups, or other comparisons between sets of distances. This could certainly benefit from being written for the GPU, though not for RT cores specifically. The algorithms for static scanning and SLAM would be different, and having the instrument position would help. If you've got a sample data set with these types of issues that you can share in E57, I'll have a look at it. It is worth remembering that the GPU is already under-used for general-purpose processing, so us poor code monkeys have to get that sorted first for all sorts of processing :)
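A brute-force sketch of the cloud-to-cloud check suggested above: a real surface point scanned from setup A should have support nearby in setup B, while a mirror ghost usually will not. The threshold and the O(N*M) distance search are illustrative only; production code would use a k-d tree and, as discussed, could run on the GPU.

```python
# Flag probable mirror-ghost points: points in scan A with no support in scan B.
import numpy as np

def flag_ghosts(scan_a, scan_b, tol=0.05):
    """Boolean mask over scan_a: True where no scan_b point lies within tol."""
    # Full pairwise distance matrix, shape (len(a), len(b)).
    # Fine for a sketch; far too slow and memory-hungry for real clouds.
    d = np.linalg.norm(scan_a[:, None, :] - scan_b[None, :, :], axis=2)
    return d.min(axis=1) > tol

scan_a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [9.0, 9.0, 9.0]])
scan_b = np.array([[0.01, 0.0, 0.0], [1.01, 0.0, 0.0]])
ghosts = flag_ghosts(scan_a, scan_b)   # -> [False, False, True]
```

As Shane notes, the per-point independence of this check makes it a good GPU candidate; the CUDA version would be one thread per scan_a point, each searching a spatial index of scan_b.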
Shane MacLaughlin
Atlas Computers Ltd
www.atlascomputers.ie

SCC Point Cloud module
