Nvidia RTX 2080 Ti
- smacl
- Global Moderator
- Posts: 1409
- Joined: Tue Jan 25, 2011 5:12 pm
- Full Name: Shane MacLaughlin
- Company Details: Atlas Computers Ltd
- Company Position Title: Managing Director
- Country: Ireland
- Linkedin Profile: Yes
- Location: Ireland
- Has thanked: 627 times
- Been thanked: 657 times
- Contact:
Re: Nvidia RTX 2080 Ti
-
- V.I.P Member
- Posts: 958
- Joined: Sun Nov 01, 2009 11:18 pm
- Full Name: Dennis Hirota
- Company Details: Sam O Hirota Inc
- Company Position Title: President
- Country: USA
- Linkedin Profile: Yes
- Location: Hawaii, USA
- Has thanked: 87 times
- Been thanked: 379 times
Re: Nvidia RTX 2080 Ti
Shane: Your CV course is exactly where you need to be, and I am looking forward to seeing what you learn from it.
viewtopic.php?f=39&t=12054
This was my response to questions from Eugene and Jonathan about recognizing images and objects. At that time I did not have the ability to process information using Ubuntu, but I now have a 36-thread system running Ubuntu. Today I got my RTX 2080 Ti running on a 4K monitor under Ubuntu 18.04, and I will be able to use my Intel USB 3.0 Neural Compute Stick, enabling rapid prototyping, validation and deployment of Deep Neural Network (DNN) inference applications.
Two years ago, I posted this thread on the CVPR 2017 Conference in HNL.

smacl wrote: ↑Mon Apr 15, 2019 6:04 pm
Sounds like an absolute beast of a machine Dennis, I'll keep in touch with any upcoming developments. I'm currently at the tail end of a part-time course in deep learning for computer vision, so I have been somewhat swamped, but I think it will reap rewards in the longer term. Specifically, if we can align point clouds to HD images (as we do for RGB colorization) and we can use deep learning to identify objects in images, we theoretically ought to be able to use deep learning to classify objects within clouds and then replace those objects with discrete polygon models that get re-colored with the photography. There's quite a bit more to this, but it is where I see point cloud processing going, and it will certainly be compute intensive! Even some of the old-school algorithms for detecting common geometries, such as cylinders for lamp posts, really struggle on larger scans and would benefit hugely from being ported to a high-end GPU.

dhirota wrote: ↑Sun Apr 14, 2019 11:32 pm
Shane: I have decided to spend a few US$ to help the economy and expand our research efforts by installing an EVGA Nvidia RTX 2080 Ti into our i9-9980XE, 18-core/36-thread, 128GB RAM workstation to test anything that you might produce (hopefully soon) as well as some other software.
For those that cannot see the image, it is a bunch of fruit identified by contents. Will let you know if I get it running.

dhirota wrote: ↑Thu Aug 10, 2017 1:41 am
Eugene and Jonathan
I forgot to mention that the DJI Spark UAV is using the Intel Neural Compute Stick AI chip to track the hand motions to control the UAV by recognizing objects.
"The Neural Compute Stick enables rapid prototyping, validation and deployment of Deep Neural Network (DNN) inference applications at the edge. Its low-power VPU architecture enables an entirely new segment of AI applications that aren't reliant on a connection to the cloud. This allows deep learning developers to profile, tune, and deploy Convolutional Neural Network (CNN) on low-power applications that require real-time inferencing."
You need an x86_64 computer running Ubuntu 16.04 | USB 2.0 Type-A port (USB 3.0 recommended) | 1GB RAM | 4GB free storage space.
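Shane's remark in the quote above, that old-school detectors for common geometries such as lamp-post cylinders really struggle on larger scans, can be illustrated with a minimal sketch. This is a hypothetical example under stated assumptions (NumPy, a synthetic cloud, and a simple algebraic circle fit standing in for a full cylinder detector), not anyone's production code:

```python
import numpy as np

def fit_vertical_cylinder(points):
    """Estimate the axis position (x, y) and radius of a roughly
    vertical cylinder (e.g. a lamp post) by projecting the points
    onto the XY plane and doing an algebraic (Kasa) circle fit."""
    x, y = points[:, 0], points[:, 1]
    # Solve x^2 + y^2 = 2ax + 2by + c in the least-squares sense;
    # the circle centre is (a, b) and radius^2 = c + a^2 + b^2.
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(c + a ** 2 + b ** 2)
    return (a, b), radius

# Synthetic lamp post: radius 0.1 m, centred at (5, 3), 4 m tall.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 2000)
z = rng.uniform(0, 4, 2000)
pts = np.column_stack([5 + 0.1 * np.cos(theta),
                       3 + 0.1 * np.sin(theta), z])
pts[:, :2] += rng.normal(0, 0.002, (2000, 2))  # scanner noise
(cx, cy), r = fit_vertical_cylinder(pts)
```

A real detector would add RANSAC-style outlier rejection and estimate the axis direction rather than assuming it vertical; the point is that the per-point arithmetic is embarrassingly parallel, which is exactly why it ports well to a GPU.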
- smacl
- Global Moderator
- Posts: 1409
- Joined: Tue Jan 25, 2011 5:12 pm
- Full Name: Shane MacLaughlin
- Company Details: Atlas Computers Ltd
- Company Position Title: Managing Director
- Country: Ireland
- Linkedin Profile: Yes
- Location: Ireland
- Has thanked: 627 times
- Been thanked: 657 times
- Contact:
Re: Nvidia RTX 2080 Ti
Doing it more out of interest in the subject than anything else, if I'm honest. I've been running Atlas for three decades now, so the last update to my CV probably talks about the wonders of MS-DOS and how we can finally ditch CP/M.
Great work Dennis, a friend of mine has a DJI Spark so I must have a look at it in action. Have you looked at the Google Open Images library? Lots of training data there. From what I gather, man-made objects with low variance don't need nearly as much data as objects with high variance, e.g. there are fewer types of lamp post and pillar than there are types of car, and fewer types of car than of people. YOLO and TensorFlow both have easy interfaces to get up and running for real-time applications, though both come up with bounding boxes rather than bounding polygons, and the results can be weak without strong training data. For the data to be properly useful, having identified objects, you then need to estimate pose, which involves having training data with enough distinct key-points. If you go back to the point cloud from the bounding box, you could possibly derive bounding polygons from depth and camera-oriented normals and use these in turn to derive pose. You could also do the same using a stereo pair of photographs. Most of what I've seen so far with CNNs and DNNs is based around 2D images, as these are the most commonly encountered form of data at this point in time. They're also very explicit, e.g. looking to describe something as a very specific object such as an apple or orange, as opposed to a roughly spherical fruit that is orange or red in color, as we see over on Joon's thread. The pose estimation based on key points is actually a very similar problem to cloud-to-cloud registration; the guys from Correvate did a good demo on this last year.

dhirota wrote:
Two years ago, I posted this thread on the CVPR 2017 Conference in HNL.
viewtopic.php?f=39&t=12054
This was my response to questions from Eugene and Jonathan about recognizing images and objects. At that time I did not have the ability to process information using Ubuntu, but I now have a 36-thread system running Ubuntu. Today I got my RTX 2080 Ti running on a 4K monitor under Ubuntu 18.04, and I will be able to use my Intel USB 3.0 Neural Compute Stick, enabling rapid prototyping, validation and deployment of Deep Neural Network (DNN) inference applications.
Will let you know if I get it running.

dhirota wrote: ↑Thu Aug 10, 2017 1:41 am
Eugene and Jonathan
I forgot to mention that the DJI Spark UAV is using the Intel Neural Compute Stick AI chip to track the hand motions to control the UAV by recognizing objects.
"The Neural Compute Stick enables rapid prototyping, validation and deployment of Deep Neural Network (DNN) inference applications at the edge. Its low-power VPU architecture enables an entirely new segment of AI applications that aren't reliant on a connection to the cloud. This allows deep learning developers to profile, tune, and deploy Convolutional Neural Network (CNN) on low-power applications that require real-time inferencing."
You need an x86_64 computer running Ubuntu 16.04 | USB 2.0 Type-A port (USB 3.0 recommended) | 1GB RAM | 4GB free storage space.
I suspect that computer vision and AI-based object detection today is roughly where laser scanning was when the first Cyrax hit the streets.
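Shane's suggestion of going from a 2D detector's bounding box back into the point cloud can be sketched roughly as below. This is a minimal illustration assuming NumPy, a pinhole camera model, and a cloud already aligned to the camera frame (as it would be for RGB colorization); the function name, intrinsics, and numbers are all made up for the example:

```python
import numpy as np

def points_in_bbox(points, K, bbox):
    """Select the 3D points whose pinhole projection falls inside a
    2D detector bounding box (x_min, y_min, x_max, y_max).
    Assumes the points are already in the camera frame, z forward."""
    z = points[:, 2]
    valid = z > 0                       # only points in front of the camera
    uvw = (K @ points.T).T              # project with intrinsics K
    u = uvw[:, 0] / uvw[:, 2]
    v = uvw[:, 1] / uvw[:, 2]
    x0, y0, x1, y1 = bbox
    inside = valid & (u >= x0) & (u <= x1) & (v >= y0) & (v <= y1)
    return points[inside]

# Toy example: 1000 random points, a 640x480 camera, a box near the centre.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
rng = np.random.default_rng(1)
cloud = rng.uniform([-2, -2, 1], [2, 2, 10], (1000, 3))
subset = points_in_bbox(cloud, K, (300, 220, 340, 260))
```

Deriving a bounding polygon, and from it a pose, would then operate on the returned subset, e.g. via depth clustering and normal estimation, which is exactly the step where the bounding box stops being a box.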
-
- V.I.P Member
- Posts: 205
- Joined: Tue Mar 06, 2018 6:56 pm
- Full Name: Eric Guizzetti
- Company Details: Construction and Engineering
- Company Position Title: RealityCapture
- Country: USA
- Linkedin Profile: Yes
- Has thanked: 14 times
- Been thanked: 11 times
- Contact:
Re: Nvidia RTX 2080 Ti
If you're registering with #Register360 then you want as many cores as possible. If you want a ton of monitors running high res, want to dive into VR, or even just enjoy solid, fast engineering refresh rates (things moving on your screen), you really should purchase current stuff. I am about to get an RTX 2080 Ti to replace a TITAN Xp. I will run some tests to share with the room. I am on a mission to test stuff at Paul's level.

smacl wrote: ↑Fri Oct 26, 2018 11:38 am
As the owner of a small software house, I spend most of my hours writing point cloud software these days and watch what is going on with the development side of this industry. Development is changing rapidly, and more developers are getting up to speed with multi-threading CPU algorithms and, to a lesser extent, GPU programming. What I think you'll see as new releases of point cloud processing software emerge is that they'll increasingly use these technologies. Note that writing and debugging multi-threaded code is more difficult than single-threaded code, particularly when refactoring existing code, and porting to the GPU is more difficult still. As such, it is expensive and will appear in dribs and drabs. Buying a new workstation, the processors with faster single-threaded performance, such as the i9, will be fastest for many existing applications and complex routines like meshing. As time goes on, more cores will become increasingly important, so the likes of the Threadripper 1950X and 2950X will more likely dominate in newer and yet-to-be-released software. For point clouds, effective GPU usage beyond just graphics will also start making its impact felt, so GPUs with a big core count and memory will make a huge difference for certain tasks on certain software. nVidia cards are currently the better bet as the nVidia-specific GPU language CUDA is currently very popular.
I was helping a friend choose a decent but cost-effective point cloud workstation last week, and after going through the options we went with a Dell Alienware with an AMD Threadripper 1950X, 64GB RAM, twin 1080 Ti GPUs, a 2TB SSD and 2x8TB HDDs. If it was for myself, I'd probably build rather than buy, but for those looking for hardware support, Dell is a good option.

Wdigman - Europe wrote: ↑Fri Oct 26, 2018 10:41 am
Again any software engineers want to chime in I am game. Plan on building a new machine soon so I will be watching this thread.
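The multi-threading point in the quoted post, that per-point work splits naturally across cores, can be sketched with a toy chunked reduction. This is a minimal, hypothetical example assuming NumPy and Python's standard thread pool; real point-cloud pipelines would push millions of points per chunk:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def centroid_chunk(chunk):
    # Per-chunk partial sums, so results combine exactly afterwards.
    return chunk.sum(axis=0), len(chunk)

def parallel_centroid(points, workers=4):
    """Compute a point-cloud centroid by splitting the cloud into
    chunks and reducing partial sums across a thread pool. NumPy
    releases the GIL inside sum(), so the threads genuinely overlap."""
    chunks = np.array_split(points, workers)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(centroid_chunk, chunks))
    total = sum(s for s, _ in partials)
    count = sum(n for _, n in partials)
    return total / count

cloud = np.arange(30.0).reshape(10, 3)
c = parallel_centroid(cloud)
```

The same split-and-reduce shape is what a GPU port expresses per thread block, which is one reason routines like meshing and normal estimation move to CUDA relatively cleanly once someone pays the refactoring cost.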
Eric Guizzetti
Long Beach, CA
www.dlgroup.com
www.eggreality.com
Insta: egg.reality
Linkedin - https://www.linkedin.com/in/ericguizzetti/
YouTube- https://www.youtube.com/c/ericguizzetti/
-
- V.I.P Member
- Posts: 205
- Joined: Tue Mar 06, 2018 6:56 pm
- Full Name: Eric Guizzetti
- Company Details: Construction and Engineering
- Company Position Title: RealityCapture
- Country: USA
- Linkedin Profile: Yes
- Has thanked: 14 times
- Been thanked: 11 times
- Contact:
Re: Nvidia RTX 2080 Ti
Scanning normally reports 2% GPU use, but when moving the scan data around I see spikes into the 60% range.
Eric Guizzetti
Long Beach, CA
www.dlgroup.com
www.eggreality.com
Insta: egg.reality
Linkedin - https://www.linkedin.com/in/ericguizzetti/
YouTube- https://www.youtube.com/c/ericguizzetti/
- smacl
- Global Moderator
- Posts: 1409
- Joined: Tue Jan 25, 2011 5:12 pm
- Full Name: Shane MacLaughlin
- Company Details: Atlas Computers Ltd
- Company Position Title: Managing Director
- Country: Ireland
- Linkedin Profile: Yes
- Location: Ireland
- Has thanked: 627 times
- Been thanked: 657 times
- Contact:
Re: Nvidia RTX 2080 Ti
So I built myself a new dev machine over the weekend with a Threadripper and an RTX-capable nVidia GPU, with the idea of seeing how easy it is to use the new ray tracing and tensor cores with point clouds and images. I think most developers in this area are also looking at better leverage of the GPU, so I'd expect to see much higher GPU and combined GPU/CPU usage in future updates to scanning and imaging software across most vendors. I'd be interested to see how much heat is generated when GPU and CPU are firing on all cylinders for an extended period of time.
- Carbix
- V.I.P Member
- Posts: 236
- Joined: Sat Mar 16, 2019 1:36 am
- Full Name: Daniel Loney
- Company Details: Excelsior Measuring
- Company Position Title: Owner
- Country: Canada
- Linkedin Profile: Yes
- Location: Canada
- Has thanked: 33 times
- Been thanked: 50 times
- Contact:
Re: Nvidia RTX 2080 Ti
Something in the back of my mind tells me that RT could be used for thinning and cleaning up of artifacts. Not so much for single setups, but for unified scans: bump map creation for comparison, to help clean up edge and grid noise. I don't know what it's called, but the crap we get from fences.
What if we ran RT from 3 setup locations at once and not just a single PV. Could also be used for registration.
Dennis... why such a big screen?
-
- V.I.P Member
- Posts: 958
- Joined: Sun Nov 01, 2009 11:18 pm
- Full Name: Dennis Hirota
- Company Details: Sam O Hirota Inc
- Company Position Title: President
- Country: USA
- Linkedin Profile: Yes
- Location: Hawaii, USA
- Has thanked: 87 times
- Been thanked: 379 times
Re: Nvidia RTX 2080 Ti
Daniel
We purchased the 4K 75-inch Samsung TV/monitor about six years ago for our conference room, to display VR and presentations back when no one else had a video card that could upscale YouTube videos and display images and scans on 4K screens. I also have a 4K 43-inch Samsung TV/monitor, which handles four times the display information of a 1080p screen, running on my 18-core/36-thread Ubuntu machine; it is also used as a portable display when presentation locations do not have a large 4K TV. I have an RTX 2080 driving the 75-inch screen and an RTX 2080 Ti driving the 43-inch screen.
Last month I was looking at 8K 82-inch Sony TV/monitor displays to mount several vertically on our 20-foot wall, but decided that most people could not tell the difference between 8K and 4K given the cost today.
- Carbix
- V.I.P Member
- Posts: 236
- Joined: Sat Mar 16, 2019 1:36 am
- Full Name: Daniel Loney
- Company Details: Excelsior Measuring
- Company Position Title: Owner
- Country: Canada
- Linkedin Profile: Yes
- Location: Canada
- Has thanked: 33 times
- Been thanked: 50 times
- Contact:
Re: Nvidia RTX 2080 Ti
Pushing the points to 8K should be interesting. With the software we use, most of the computing is done on the CPU (a 2990WX); the 1080 Ti only sees about 30% load at most...
I wonder where the bottleneck will be with 8K.
-
- V.I.P Member
- Posts: 958
- Joined: Sun Nov 01, 2009 11:18 pm
- Full Name: Dennis Hirota
- Company Details: Sam O Hirota Inc
- Company Position Title: President
- Country: USA
- Linkedin Profile: Yes
- Location: Hawaii, USA
- Has thanked: 87 times
- Been thanked: 379 times
Re: Nvidia RTX 2080 Ti
One of the interesting concerns with recent Nvidia GTX products has been how many PCIe slots of space they use. I have used GTX 1050 Ti video cards/GPUs for some applications in the past, but recent pricing has not receded from last year's US$200 to US$250. So, without checking, I purchased 3 Nvidia GTX 1650 video cards/GPUs for US$145 each and found out that they take up 3 PCIe slots of space instead of 2 PCIe slots like the GTX 1050 Ti. It may become a problem if you have only 6 PCIe slots and 28 PCIe lanes, or worse 16 PCIe lanes, in your workstation, since it will take up 2 of the 3 x16 PCIe slots that many motherboards have.

dhirota wrote: ↑Sun Apr 14, 2019 11:32 pm
Shane: I have decided to spend a few US$ to help the economy and expand our research efforts by installing an EVGA Nvidia RTX 2080 Ti into our i9-9980XE, 18-core/36-thread, 128GB RAM workstation to test anything that you might produce (hopefully soon) as well as some other software. I found out that the RTX FTW3 2080 Ti requires 3 PCIe slots of space versus the RTX 2080 that requires just 2 PCIe slots of space, so I decided to install an RTX 2080 to replace the GTX 1080 in our dual Xeon,