No temperature sensors for GTX 1080 Ti – Dual card setup

Hello there,

Before I got my brand new GTX 1080 Ti this week I had a dual card setup: a GTX 770 and a GTX 780. One drove my two monitors (the 770, 2 GB) and one was for testing CUDA-accelerated machine learning projects (the 780, 6 GB). With this setup I saw temperature sensors for both cards in apps like iStat and HWMonitor, but since I swapped the 770 for the 780 (now in PCIe slot 1) and the 780 for the 1080 Ti (now in PCIe slot 2), I only get temperature info from the 780.

I had some problems getting the 1080 Ti to run at first, but after reinstalling the newest Clover twice and installing the newest Nvidia web driver (and setting it to be loaded), the cards now work great together! The 780 is used for screen output (and sometimes CUDA computations) and the 1080 Ti entirely for CUDA work. The first card's VRAM handles all the Metal shader / UI stuff (about 1.5 of its 6 GB on average), so the 1080 Ti has the full 11 GB available for TensorFlow etc.
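
For reference, a quick check like this (plain CUDA runtime calls, nothing exotic) is what I use to confirm which card gets which CUDA device index before pointing TensorFlow at it. The file name and the assumption that the 1080 Ti ends up at index 1 are just placeholders until you look at the printed names:

```cpp
// list_gpus.cu – enumerate CUDA devices and their VRAM so you can see
// which index belongs to the 780 and which to the 1080 Ti.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA devices visible - is the web driver loaded?\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s, %.1f GB VRAM\n",
                    i, prop.name, prop.totalGlobalMem / 1073741824.0);
    }
    // Assumption: the 1080 Ti shows up as index 1 - adjust after checking above.
    cudaSetDevice(1);
    return 0;
}
```

Build it with nvcc list_gpus.cu -o list_gpus; if you only want one card visible to a given process, the CUDA_VISIBLE_DEVICES environment variable does the same job without touching code.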

But since running TensorFlow computations on the GPU puts a lot of stress on the card, it is crucial to know the maximum and average temperatures, so I can tell whether the stock coolers are good enough.

Maybe one of the following things could fix it – has anyone tried one of these?

  • Do I need to add or change something in my Clover config?

  • Do I need to use a different FakeSMC.kext?

  • Do you know a way to read the card's temperature and fan values directly from CUDA C++ code? I found the code below on Stack Overflow but haven't tried it yet – I don't know if it will even work on macOS (some of Nvidia's monitoring tools and APIs, like nvidia-smi, were never released for macOS). A rough sketch of the NVML route is right after the link:

https://stackoverflow.com/questions/15709958/get-temperature-from-nvidia-gpu-using-nvapi
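
I haven't tried it yet, but from what I can tell the NVAPI calls in that Stack Overflow answer are Windows-only. The cross-platform way to read temperature and fan speed is NVML, the library nvidia-smi itself is built on. Whether the macOS web driver ships an NVML library at all is exactly the part I'm unsure about – but if it does, the call sequence would look roughly like this (file name is just a placeholder):

```cpp
// gpu_temp.cpp – rough NVML sketch; nvml.h comes with the CUDA toolkit.
// Untested on macOS: it only shows what the API calls would look like.
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) {
        std::printf("NVML not available on this system\n");
        return 1;
    }
    unsigned int count = 0;
    nvmlDeviceGetCount(&count);
    for (unsigned int i = 0; i < count; ++i) {
        nvmlDevice_t dev;
        char name[NVML_DEVICE_NAME_BUFFER_SIZE];
        unsigned int temp = 0, fan = 0;
        nvmlDeviceGetHandleByIndex(i, &dev);
        nvmlDeviceGetName(dev, name, NVML_DEVICE_NAME_BUFFER_SIZE);
        nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp);  // degrees C
        nvmlDeviceGetFanSpeed(dev, &fan);                            // percent
        std::printf("GPU %u (%s): %u C, fan %u%%\n", i, name, temp, fan);
    }
    nvmlShutdown();
    return 0;
}
```

On Linux this links with -lnvidia-ml; if the web driver doesn't expose NVML on macOS, then the SMC route (FakeSMC with a GPU sensor plugin, which is what feeds iStat/HWMonitor) is probably the only way to get the numbers back.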

To the 1080 Ti users: Did you see your temp/fan sensor information right after installing the card? If not, what did you do to fix it?

Thanks for any help or pointers so I can fix it myself!
