
The Book of GPU and RAM: Over-Clocking

Updated: Oct 6




Well hell, I never thought I would be taught anything new about GPU cards or DIMM modules in everyday-use computers, but boy was I wrong about that!!


What started this little adventure for me was puzzling over why some folks who were using PC Benchmark, like I was, got wildly different benchmark ratings for their various graphics cards, memory and CPUs.


The exact same gear I had in fact.


Now if you run CAD or gaming software like I do, or even flight simulator rigs, you learn quickly that this software takes total control of your GPU and proceeds to stress it to the max potential it has on offer, sans any tuning effort on your part as the user.


All I had to do was make sure XMP was on in the BIOS and let the app worry about the graphics available from the GPU that was in the machine.


The monitor you are using will actually dictate what your eyeballs feed to your synapses, resolution-wise, on the graphics side of the equation (assuming your GPU can deliver 4K output).


I have been laboring under the illusion all these years that the OS, be it Windows or Linux, would max out the GPU resolution as the application demanded it.


Of course most folks do not have the monitor to match their high end GPU card which I have always found mildly fascinating.


Applications like a video game or CAD software take control of the GPU hardware for their own purposes, and they all run on common OS platforms like Linux or Windows.


My various flight sim systems do this as do these various high end games like Crysis and my CAD apps like P-CAD as well.



However, when you whack the latest and greatest GPU into any PC and benchmark it raw and untuned, the results are almost always very, very disappointing.


So, after my rigs got fried, I determined I needed to finally get to the bottom of this puzzle. I selected some of my surviving GPUs, got some new PCIe 4.0 capable ones, and tried to match the best benchmarking stats others had achieved with the same hardware I had.


The first lesson I learnt here was actually about RAM clock speeds. RAM on the motherboard, that is, not the GPU's DDR RAM.


I mean, I knew what these RAM clock settings did, I just never bothered with them, as I felt that overclocking was going way beyond the normal capability of the gear.


What I had missed however is that RAM seldom just works at the right frequency or settings. You do have to work at that aspect a bit.


OEMs have been building these computer bits expecting people to subject them to max-this and max-that all the time, yet they are not even shipped at optimal settings.


When I last played with OC I had liquid nitrogen involved, and stuff overheated and broke often.


Not because I burnt it, but because the temperature differential between cold and instant performance causes breaks at the chemical boundary of germanium and silicon, or whatever it is they use these days to make electronic substrates.


Memory is different from other components when it comes to OC aspirations, as not much memory has reliable overclocking headroom in my experience.


My prior stance on overclocking also explains why my components have been living five times longer than my pals' bits and pieces; I only stress them in the apps I am using them for at that moment.


After that it goes back to base under clocked settings. Very boring.


Ryzen Master helped me out greatly with this noble max-benchmark quest, by the way, both in terms of understanding it and in setting the RAM to max performance all the time.


Most of the time the OS just takes what the BIOS and chipset drivers give it and is not very intelligent about the power of what is in the average rig.


Using Ryzen Master takes a whole lot of the manual tuning you had to do in the past, on Intel and prior AMD FX-series chips, out of the equation.


It does it for you. For the CPU and RAM that is. On Ryzen Chips at any rate. Don't try Ryzen Master on any Intel i7 or i9 CPU!!


What I do not understand is why these memory manufacturers don't tell you, in a small leaflet, what the best settings are for the RAM you just bought.


Taiwanese humor is my conclusion on that score.....


I have in the past spent as many as three weekends tuning specifically built custom rigs to the max to this end.


However, this was usually internal to a single Application that took control of the hardware resources. I was tuning a specific app.


Now I find that to get the best results from your hardware at the OS level you have to overclock the individual components or set them to their max capability in the BIOS one at a time.


This is tedious.


Often you use the combo of BIOS and various tools in cahoots with the chipset driver software like Ryzen Master to drive it to the edge of the performance envelope.


The graphics and Memory card OEMs expect you to know this stuff by default btw!


You will never be able to tune RAM if the BIOS does not enable XMP!! No XMP = forget about other performance tuning antics and aspirations you may have.


You can use various clock settings to get the most out of RAM chips. For the cheaper LPX 2400 type RAM, I suggest you just get it to its max frequency and leave it there.


17-18-18-18-36-2T and similar timing settings take a fair bit of dithering over, so prepare to spend a lot of time on this sort of thing if you venture here.
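Those timing numbers are counted in clock cycles, with the first being CAS latency, so the same number means different real-world latency at different speeds. A minimal sketch of the arithmetic (the CL16 3200 comparison figure is an illustrative assumption, not a measurement from my rigs):

```python
# RAM timings are in clock cycles, so true (first-word) latency in
# nanoseconds depends on the real clock the RAM is running at.
# DDR transfers twice per clock, so DDR4-2400 runs a 1200 MHz real clock.

def true_latency_ns(cas_cycles: float, ddr_rating: int) -> float:
    real_clock_mhz = ddr_rating / 2        # double data rate
    return cas_cycles / real_clock_mhz * 1000

# DDR4-2400 at CL17 (the leading 17 in 17-18-18-18-36-2T):
print(round(true_latency_ns(17, 2400), 2))   # ≈ 14.17 ns

# Hypothetical DDR4-3200 at CL16 for comparison:
print(round(true_latency_ns(16, 3200), 2))   # = 10.0 ns
```

This is why chasing a higher frequency at looser timings does not always buy you much; the nanoseconds are what the CPU actually waits on.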


My RAM clock was always defaulting to 667 MHz, and PC Benchmark was telling me my stuff was scoring very poorly, in the bottom 15% for everything in my rig.


So I bought the best and fastest RAM the other day for all six of my Ryzen rigs and got a self-taught lesson in setting it to the max settings it could manage without failing.


Cheap RAM does not respond well to XMP settings in the BIOS and will default to the lowest clock that is stable for the motherboard and CPU when you turn it on for the first time.


This is why a first power on of a new rig may take 5 minutes before it hands over to the OS. The BIOS is figuring out what is stable. Bare minimum stable that is but it still won't turn on XMP for you.


Many people think their installs failed and then bring me their rigs to get going and all I do is talk to them for 5 minutes and it just comes up while we are chatting.


Usually, that is.


Good quality RAM will take the XMP settings no problem but you have to turn it on!


After the BIOS takes the XMP settings, you can then go into Ryzen Master, where you will see switches for max frequency as well as some other settings under Memory OC.



The BIOS always reports memory frequency as half of what it actually is, by the way.


RAM that reports it's running at 1000 MHz is actually running at 2000 MHz. Most of my old RAM was 2400 but it was running at 667 x 2 = 1334. Now there are hundreds of clock combos you can play with to set it better, but each one requires a reboot.


You have to set this RAM to clock at 1200 for it to work at 2400.
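The halving is just DDR arithmetic: double data rate means two transfers per clock, so the effective "marketing" speed is twice the clock the BIOS shows. A quick sketch using the numbers from the text:

```python
# DDR = Double Data Rate: two transfers per clock cycle, so the
# effective speed is twice the clock the BIOS reports.

def effective_speed(reported_clock_mhz: float) -> float:
    return reported_clock_mhz * 2

def required_clock(ddr_rating: int) -> float:
    return ddr_rating / 2

print(effective_speed(667))    # 1334 -- the default my 2400 RAM fell back to
print(required_clock(2400))    # 1200.0 -- the clock to set for true DDR4-2400
```

So if your 2400-rated sticks show 667 in the BIOS, they are leaving nearly half their rated bandwidth on the table.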


This RAM clock settings lark is not a good use of your time unless you happen to know the best settings offhand. Google it is my advice.


Now good 2400 RAM I have over-clocked to 2666 in the stable zone. The chips you can do this to are few and far between by the way.


With 3200 RAM I am not having much success so far, getting only unstable 3466 peaks with that stuff. I have left all of it at 3200.


I did play with some 3600 RAM and found that too was not worth the OC efforts.


So anyways, once all six of my rigs were set, I considered myself fairly well educated on the subject of RAM and RAM overclocking capabilities.


Now on to the problematic GPU stuff.


So, just like with RAM, I noticed guys on Reddit and other hardware forums talking about all sorts of GPU card settings that were Greek to me.


I randomly picked the NVIDIA GTX 970 and a new AMD Radeon RX 5600 XT from my GPU bin, covering two ends of the spectrum for my messing-around purposes.


I initially bought an entire box of GTX 970s for around $89 per card because a well-respected OC guru claimed it was better value than the GTX 980 series.


However, I was never able to get it to perform in a benchmark with the results desired.


In my custom flight sim apps or CAD apps, no problem at all, but benchmarking in raw Windows or Linux was a fat failure with that card.


As I read comments on this subject online, I started to realize these folks were all using software from ASUS or MSI to tune the GPU and force the OS to load these overclocked GPU defaults at boot time.


Aha I thought!! Autoexec.bat and Config.sys antics did not go away after all!!


I was not wrong in this thought stream either.


There are quite a few of these GPU tuning tools you can use for this purpose, and you do not need to have any of that manufacturer's products in your computer to use them, either.


I chose MSI Afterburner after not-so-stellar results with about 11 different GPU tuning tools. I downloaded MSI Afterburner from one of the blogs I had read on tuning GPUs, which took me to the MSI site.


I already have a ton of MSI products in my collection but none of their GPU offerings.



Be aware that NVIDIA laptop GPUs like the MX150 and MX250 need specific GPU tuning tools for this to work well, and honestly, this is a thorough waste of time.


Afterburner will not work on these MX-series GPUs from NVIDIA. I got the ASUS GPU tuning tool to work with them, but it led to my Huawei Mach29 freezing solid.


I just found out the EVGA Precision X1 tool works real swell on these MX rigs FYI. One reader of my blog from Taipei sent me some data and a link and it was very useful, thanks Alex!


Most of these laptop rigs expect you to add a nice eGPU box with a fat and powerful GPU installed inside of it.


In this case the MX150/250 series chips become the GPU shim-runners and the eGPU card does all the heavy GPU lifting.


You connect the eGPU to the laptop via a USB-C cable.


Now, if your laptop uses 4 lanes you are dandy; if it uses just two, like my Huawei laptop does, do not get too carried away whacking the biggest GPU inside your eGPU box, because it won't be able to use it fully.


Those combos turn otherwise light laptops GPU wise into powerful workstation class rigs.


If your Laptop Chipset is a lower end AMD Vega or Intel 6xx forget about it!! It is what it is with that crud.


It turns out the GPU manufacturers are very conservative in their default card settings. Some of these GPUs can overclock to 4 GHz and 95 degrees C!!


Just an FYI, do not try to performance-tune a laptop with an NVIDIA MX-series chip and an eGPU with a Radeon inside of it. Match the GPU vendor at least!!


The 128-bit MX is expecting to talk to the bigger NVIDIA silicon found inside an RTX 2080 or some such GPU in your eGPU rig.


There are also MX150 variants, for example the Razer Blade Ultrabook uses the MX150 25W chip and all 4 lanes. My Huawei uses the 15W MX150 but only 2 of 4 lanes for GPU traffic.


With that MX150 25W 4 lane setup you can max out any eGPU to your heart's content with a beefier GPU inside the eGPU box.
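The lane count matters because each PCIe 3.0 lane carries roughly 985 MB/s after encoding overhead, so a 2-lane laptop gives the eGPU half the bandwidth of a 4-lane one. A rough sketch of the arithmetic (per-lane figure derived from the standard PCIe 3.0 numbers, 8 GT/s with 128b/130b encoding):

```python
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding gives roughly
# 0.985 GB/s of usable bandwidth per lane.
PCIE3_GBPS_PER_LANE = 8 * (128 / 130) / 8   # GT/s * encoding efficiency / bits-per-byte

def egpu_bandwidth_gbps(lanes: int) -> float:
    return lanes * PCIE3_GBPS_PER_LANE

print(round(egpu_bandwidth_gbps(4), 2))   # ~3.94 GB/s -- 4-lane hookup like the Razer Blade
print(round(egpu_bandwidth_gbps(2), 2))   # ~1.97 GB/s -- 2-lane rigs like my Huawei
```

That halved pipe is exactly why a monster card in a 2-lane eGPU box spends a lot of its time waiting for data instead of drawing frames.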


Also note that some people are trying to use the eGPU to make the laptop screen faster and better.


The Huawei has a weird 3K screen and I have pronounced efforts to that end a total waste of valuable time you will never get back.


Plug a good quality monitor into the eGPU box and use that, for God's sake, people!!


Dragging around an eGPU just to make the laptop screen handle games better is an exercise to enhance futility itself.


Mine is plugged into a nice 4K Samsung 32" monitor. My eGPU is using one of my many EVGA GTX970 cards.


You cannot even compare the Laptop 3K touch screen to this monitor and it does not freeze the 4K monitor at all.


Using the eGPU with just the Laptop screen is not a smooth or happy experience.


Note that when you overclock these PCIe x16 GPU cards to the tune of 4 GHz @ 95 degrees C, they will obviously not have a long life.


I chose conservative settings for base and memory clock that did not violate my thermal envelopes.


Unless your GPU and CPU are water cooled with the right cooling gear you have to pay attention to this thermal envelope.


I have recently found that most of the low end liquid cooling stuff for gaming enthusiast build purposes is somewhat of a serious joke.


Good liquid cooling does make a difference but it is messy and potentially hazardous to the long life of your Computer.


You can tune your stuff to stable zones with air cooling pretty adequately in most cases.


I have removed the liquid cooling from all of my rigs and switched to Noctua air-cooled gear, as it actually runs at lower temps and a higher base clock than what my water-cooled H100i fare was doing.


Now, when you tune a GPU on an air cooled rig you have to be careful about the air temps the CPU will be using. They are all sharing the same pool of air!!


You will also be adding molto cooling fans to your rig with air cooling, by the way; make sure you think about the direction of airflow in and out of the box, with all those fans.


One of my pals had the fans working against each other, and he ended up with a 100-degree-C GPU in the rig, as the air was stalled inside it.


After I made sure the airflow was pulling in from one end and sucking out the other in a straight line, we got it to idle at 36 degrees C and settle at 75 degrees C, so this is important!!


I observed that the GTX 970 did in fact have the wide range of tuning capabilities reported by various tuning gurus, but I did not like the whine or heat that came with it, so I dialed the max back quite a bit.


On the benchmarking front I went from 17% to 54% @ 54 degrees C. I only increased the base clock by +175 and the memory clock by +200.
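Afterburner-style tuning works in offsets: MHz added on top of the card's stock clocks rather than absolute values. A sketch of what those +175/+200 offsets work out to, assuming the GTX 970 reference-design clocks (board-partner cards vary):

```python
# Afterburner-style overclocking: offsets in MHz are added on top of
# the card's stock clocks. Stock figures below are the GTX 970
# reference design (assumed); actual board-partner cards differ.
STOCK_CORE_MHZ = 1050
STOCK_MEM_MHZ = 1753      # GDDR5 real clock (roughly 7 GHz effective)

def apply_offsets(core_offset: int, mem_offset: int):
    core = STOCK_CORE_MHZ + core_offset
    mem = STOCK_MEM_MHZ + mem_offset
    core_gain_pct = round(core_offset / STOCK_CORE_MHZ * 100, 1)
    return core, mem, core_gain_pct

# The +175 core / +200 memory offsets from the text:
print(apply_offsets(175, 200))   # (1225, 1953, 16.7)
```

A roughly 17% core clock bump for a benchmark score that more than tripled shows how much headroom the conservative factory defaults leave on the table.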


I played with it till it failed by the way. Just for giggles. I had three fire extinguishers handy just in case spontaneous combustion was the game of the day....


After 54 degrees C it leaps to 68 then to over 94 degrees C.


My CAD and Flight SIM apps take all they can get but keep temps in the optimal zone as well sans tuning efforts.


I dragged my rigs into the Nuclear submarine or UFO category as a result on PC Benchmark tests.


My SSD numbers are insane by default, as PCIe 4.0 and NVMe SSDs in X570 and TRX40 rigs love each other a lot. I scored 332% above the component average in benchmarks with that stuff.


Believe it or not, there is one other component that makes a huge difference to benchmarking scores.


The good old PSU.


I have a few 650W and 1000W units, with the odd 1200W PSU, lying around here in my benchmark test lab.


The bigger the PSU the better the Ryzen CPU handles life and maxing out the cores.


The CPU Thermals were also vastly different with each PSU for some reason.


I saw vastly different CPU OC Benchmarks just with different PSU in the same rig.


GPU overclocking is also greatly aided and abetted by a very good PSU in your rig.


If you have dual GPUs you will need to look hard at water-cooling everything, as well as having the best PSU money can buy.


My advice, therefore, is to not skimp on PSU quality or wattage range.


Get Gold or Platinum 1000W units every time as a bare minimum.
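A rough way to sanity-check that 1000W recommendation: total the major component draws, then leave headroom so the PSU sits near its efficiency sweet spot (roughly 50-70% load). The component wattages below are illustrative assumptions, not measurements from my rigs:

```python
# Rough PSU sizing: sum the major component draws, then divide by a
# target load fraction so the PSU sits near its efficiency sweet spot.
# Component wattages are illustrative assumptions.

def recommended_psu_watts(component_watts: dict, target_load: float = 0.6) -> int:
    total_draw = sum(component_watts.values())
    return round(total_draw / target_load)

rig = {
    "cpu": 150,              # e.g. a Ryzen under an all-core OC
    "gpu": 250,              # a single mid/high-end card
    "drives_fans": 50,       # SSDs, fans, pumps, sundries
    "motherboard_ram": 70,
}

print(recommended_psu_watts(rig))   # 520W draw / 0.6 ≈ 867 -- a 1000W unit fits
```

Run the same numbers with a second GPU and an overclocked CPU and you land well past 1000W, which is why dual-GPU rigs need the best PSU money can buy.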


I used to like Seasonic PSU but they are using crap quality capacitors in their stuff these days so I have switched to Corsair HXi series.


EVGA also makes a mean, good-quality PSU, as long as it is a SuperNOVA Platinum series unit.


I am actually assembling PSU from several manufacturers for a Threadripper rig build video and will post a bloggie about that stuff with my findings.


We have monster dual GPU and Optane bits in these guys that need serious smooth power.


As you can see, there is much more to building a high-performance rig than just whacking the best go-fast performance goodies into it.


This is a marvelous time suck if you are not careful, by the way. Your other half might not be quite so appreciative of your time spent on this while you do the Alice-in-PC-Wonderland thang.


Anyways, I was just notified I won the 156th GameKnot Grand Masters Chess tournament, which was an EPIC affair that started in August of 2017. A long ass time ago!


I need to bask in the glory for a minute or three on this one....and toast this fine victory with some best quality Russky Standart Platinum Vodka and some special reserve kaluga huso hybrid caviar.........


Victory was sweet!! Dunno why that one dragged on for so long either........ As the Russian toast goes: let us drink to experiencing only as much sorrow as the drops of vodka left in our glasses!




Happy New Year! Greetings for the New Year!









chaanbeard.com, IT Tech-Talk Blog focusing on AMD and Nutanix with Cloudy things
