Updated: Apr 30
When one does a ton of testing on various PC and server hardware combinations, one gets to see for oneself which of the goodies deliver better than the others.
Being a former silicon pirate in my youth with a focus on microprocessor technology has, of course, created a personal bias towards CPU evolution circa 1986 onward, and I have gone from designing them myself to appreciating new designs from others with a morbid fascination of sorts in the modern era.
I never cease to be amazed at the chemistry of these things and how AMD is making copper in their CPU offerings a factor in the never-ending Moore's Law realities.
I myself started out with telecommunications chips in digital phones and even won some design awards before I realized whatever you built in electronics was obsolete in mere weeks and sailed to the computing and networking side of the equation instead.
When Intel was the only game in town, bar a short Apple uprising based on IBM chippery and an interesting attempt from DEC via the Alpha chips back in the mists of time, nothing much changed in the Microprocessor world due to the monopoly Intel enjoyed in this space.
Sun made interesting SPARC chips for a short while but never realized the potential. I worked at HP on the PA-RISC goodies, which were stopped for some reason only HP execs could ever fathom, while they came up with a more evil plan with Intel on how to chuck $60 billion down the toilet with the Itanium chip I fondly call the Itanic; the fun and games around how not to develop technology that this chip became are taught in business schools everywhere for us to have a hearty hoot over.
Apple hooking up with Intel frankly put the Microprocessor biz back 20 years but they are making up for it now with the ARM based magic as they sail with that fascinating fleet of silicon.
AMD rose with a better chip architecture but was hamstrung by a poor manufacturing process; the Opteron CPU signaled that things were about to change, and change for the betterment of us all.
Technically the thing was pretty awesome.
With my knowledge of the silicon foundry arts and the associated games I myself played with silicon, I could imagine what their Opteron chip could have done if they had had the same foundry capabilities as Intel back then.
The battle angels were trumpeting at the gates of the Jericho that was Intel but they had their headphones on and were steadfastly ignoring everything else that was transpiring around them while their walls crumbled and fell from the acoustics from said angelic trumpet shenanigans.
Intel were stubbornly clinging on to an eyes-wide-shut MO, which was a dire strategic error.
I and hundreds of others told them this at every opportunity, but all they did was transfer the workload to their Qiryat Gat fab in HaDarom.
This was also a big mistake.
That move signaled a band aid forever MO and while the Israelis are the masters at that shit, this is only a strategy to buy time while you fix the greater problem and they just never did.
It is clear from Intel's rise and demise through complacency and arrogance that they needed a competitor like AMD all along, to violently kick them in the nether regions whenever they went to sleep from sheer boredom. Based on the patches and band-aids they kept applying instead of fixing the foundational problem, sleeping was evidently what they were now doing on an almost permanent basis.
Intel got away with selling their shocking "meh" for outrageous prices for way too long as well.
They should have had a leader that put 23% of sales margins back into the long term health and R&D that mattered for Intel for tomorrow.
When you suck blood out of a stone and never put anything back, it becomes a barren rock.
Once AMD made thinner and better wafers than Intel, never mind the superior design of their CPU, it was clear to me that Intel was going to have a real fight on their hands. They are now the far-behind underdog, in deep sticky doodoo thanks to the Optane rabbit-hole games they were playing, which took their eye off the CPU ball.
Now AMD has 5nm processes nailed down and is playing with 3nm and 4nm in their labs while Intel flounders around on 10nm and 14nm.
I even see some of the newer Xeon chips are still 14nm!!
This is itself a major big deal in Microprocessor design these days.
Both Apple-based ARM silicon and AMD silicon are 5nm now.
It is only years of same-old-same-old Intel purchases and loyalty that keep their sales as high as they are.
If people like me were making the purchasing decisions it would be an 89% dominated AMD world we live in.
Cheaper and faster being the mantra that got them there.
It was inevitable that AMD would become the more expensive of the two when they attained dominance but battles between the two should drive prices down and make the technology better in the same stroke.
Keeping both AMD and Intel on their toes though is an important factor going forward and ARM and Apple have stepped into that shoe with quite the dash of panache.
Combining GPU and CPU tech into a single package a la Android cell phone has long been something of keen interest to me and it has perplexed me that this tech took so long to get to the desktop.
I still think future phones will in fact replace all other computing devices as those things evolve but not quite as rapidly as my Sci-Fi oriented mindset is angled on the matter.
Obviously not enough people watch Star Trek re-runs to get the idea of a small hand-held device capable of technological miracles and such... (jeez!).
That is computing at the edge, not the core, by the way.
My Samsung Phones have been powerhouse mini Linux computing machines for quite a few generations now and they all used this combined GPU and CPU tech to great effect.
What Apple have done with their M1 Pro/Max silicon and the soon-to-arrive M2 platform is a serious wakeup call to both Intel and AMD, and serves up a hard slap to those folks over at NVIDIA, lest they fall asleep behind the wheel of their graphics leviathan, a la Intel.
Microsoft are working hard on Windows for ARM platforms, so it is clear where the smart money is going here.
Meanwhile, AMD's Ryzen and EPYC series of CPUs have gained quite the foothold and are technically vastly superior to anything Chipzilla has made to date; both have been impacted by the TPM hardware thang, AMD more than Chipzilla only because AMD are talking about it openly.
My lab experiences and testing of software based TPM BIOS maladies tells a different story however.
In the server space, the AMD EPYC simply trashes the Xeon six-love, six-love, any which way to Sunday, and they are already working on next-gen Zen 5, with the new Zen 4 launching in the present time-frame.
Zen 4 is going to present Intel with a very serious whupping.
Zen 3 was a bad mauling for Intel but Zen 4 is definitely in the order of serious whupp-ass stuff.
I started getting serious about AMD again via their Ryzen line from the 1700X series, and I did have a few of the previous-gen FX-8350 Black Editions in my own platforms at home that I was writing code on in the cheapest way possible.
That is to say I had me an experiment to build a coding rig for as low dollar as possible.
We also started doing video reviews of this newer Ryzen technology in 2019 with a small group of ex-microprocessor-design folks from IBM, HP and Cray, for a more seasoned POV from former specialists in this particular chippery subject area.
I personally stayed away from the Ryzen 2000 series stuff due to the crazy temperatures I saw in the testing lab but I played with them all in the lab regardless.
The Ryzen 9 3900X did impress me in terms of its capabilities back when I tested the darn things, but the price, it has to be said, never impressed me a whole bunch at all.
I grabbed me a few 3800X CPU and started to enhance my Chess AI Code on NVIDIA Quadro GV100 and the lesser RTX cards and got dragged into the cooling tricks that came with that Ryzen platform by default.
When the 5900X came along and we set it up in the lab for all the video-testing malarkey and all that jazz we get up to, I at first thought we were doing something wrong, as it did not behave like the rest of the Ryzen herd, or anything else from Chipzilla we had ever tested before, in fact.
The testing results were almost too good to be true and it ran real cool at high clock frequencies to boot.
This meant that it could be paired up with my other surprise dark horse, the Be Quiet Dark Rock Pro 4 cooler, to great effect.
Be Quiet is a different kind of PC parts cooling company by the way.
First off they are not Taiwanese or Chinese, they are in fact German!
They also dabble in AIO water-cooling solutions and make very quiet computer case fans, hence the name, Be Quiet.
When the prices on the 5900X CPU started their free-fall back in January of '22, I became interested in replacing everything I had not already converted to these magnificent Ryzen 9 chips, and I started replacing my 3800X/5800X fare one by one, with ads on various sites that trade in computer bits and bobs aiding me as my partners in this sweet update-schema crime of mine.
Every time I sold one or two I upgraded one or two to the 5900X.
I now have 7 rigs: two are about to get the new Threadripper 5995WX CPU upgrade from the Threadripper 3995WX chips now in them, and the other 5 are all shod with the Ryzen 9 5900X fare.
I had a mix of 3800X and 5800X Ryzen 7 CPU in my collection before I did this 5900X switcheroo by the way.
Oh, and I had one Ryzen 9 5950X that a mate persuaded me to sell him because of all its water-cooling gizmos, and its buddy went the same way the month prior, so I no longer have any 5950X rigs in my lineup; they run way too hot for my home-office liking.
Now, for my purposes, the 3800X and 5800X CPU were ideal from a cost and function point of view, which was then just for Chess AI workloads.
From a general computing POV, however, neither of those CPUs was up to the job, and they also both ran way too hot for my comfort, adding a lot of dB into the equation as well.
They also stuttered badly, which I solved with actual discrete TPM modules, by the way (bar the one rig running Ubuntu Linux, that is).
The firmware (BIOS-based) TPM thing is a fat waste of time and renders any computer that has this setup completely useless.
Whoever came up with that stunningly stupid idea needs banishment to a far flung moon of Saturn for their crimes against computing.
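If you want to check whether a rig is on the stutter-prone firmware TPM or a discrete module before condemning it, here is a minimal sketch; the vendor-ID heuristic is my own simplification, and on a real Linux box the ID would come from `tpm2_getcap properties-fixed` (the TPM2_PT_MANUFACTURER property).

```python
# Sketch: classify a TPM as firmware (fTPM) or discrete by its manufacturer ID.
# These vendor lists are a rough personal heuristic, not an authoritative map.

FIRMWARE_VENDORS = {"AMD", "INTC"}        # CPU vendors shipping firmware TPMs
DISCRETE_VENDORS = {"IFX", "NTC", "STM"}  # Infineon, Nuvoton, STMicro modules

def classify_tpm(manufacturer_id):
    """Return 'firmware', 'discrete', or 'unknown' for a TPM manufacturer ID."""
    vendor = manufacturer_id.strip().upper()
    if vendor in FIRMWARE_VENDORS:
        return "firmware"
    if vendor in DISCRETE_VENDORS:
        return "discrete"
    return "unknown"

if __name__ == "__main__":
    print(classify_tpm("AMD"))   # the stutter-prone fTPM case -> firmware
    print(classify_tpm("IFX"))   # a discrete Infineon module -> discrete
```

A "firmware" answer on an AMD box is the setup I am grumbling about above.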
The other difference in my new 5900X compute experience is the absence of all that heat the 3800X and 5800X thangs generated, along with the accompanying acoustic shenanigans.
The sound upstairs in our home-office work area has transformed from the pitch and whine of six 120mm fans per rig kicking into their full high-performance profile, which I used to endure while I worked, to the tick of an uncomfortable and deafening silence from these transformed, now deadly-silent 5900X rigs.
In fact, I even put two old noisy 120mm case fans back into the core rig to alert me to the workload crunching it is doing; it gives me peace of mind and audio cues as to when it is done, as it too falls totally silent when it has.
This has also allowed me to promptly investigate what malware is attacking it whenever it spins up and starts processing something with no workload running on it.
I am now catching nasty software within seconds of it activating, just by spotting CPU activity that should not be happening, with the fan-noise spin-up and my active curiosity working in cahoots to find the root cause.
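The fan-noise trick above can be approximated in software. A minimal sketch of the idea, working on busy/total jiffy counters in the style of Linux's /proc/stat "cpu" line; the 20% idle-alert threshold is an arbitrary assumption of mine.

```python
# Sketch: flag unexpected CPU activity on a box that should be idle,
# the software analogue of listening for the fans spinning up.

def cpu_busy_percent(sample1, sample2):
    """Each sample is (busy_jiffies, total_jiffies), e.g. from /proc/stat."""
    busy = sample2[0] - sample1[0]
    total = sample2[1] - sample1[1]
    return 100.0 * busy / total if total else 0.0

def should_alert(busy_percent, expected_idle=True, threshold=20.0):
    """Alert when a supposedly idle machine is crunching something."""
    return expected_idle and busy_percent > threshold

if __name__ == "__main__":
    # 600 busy jiffies out of 1000 elapsed -> 60% busy on an "idle" rig
    pct = cpu_busy_percent((1000, 5000), (1600, 6000))
    print(pct, should_alert(pct))  # 60.0 True
```

In practice you would poll the counters every few seconds and log the offending process list when the alert fires.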
Ultimately I am going to have to get used to the new silence though as I have a horde of new blade-less cooling fans heading my way and the sound free state of these new things is like uber creepy.
I did in fact play a game of Silent Wings 2 vs Silent Wings 3 case-fan wars, and surprisingly the 2 won, so all of my rigs now sport the Silent Wings 2 120mm PWM fans, and all I can hear now is the clack of my keyboard keys as I type and the birdies chirping in the garden through my open window.
I installed a little blue LED on a long stalk on all of them to give me visual clues of CPU activity.
My offices are now running some 25 degrees C cooler than usual and my various Vornado fans are about to become garage shelf-ware as I am now freezing my ass off in there.
I may keep one just to circulate air on its lowest setting which is also pretty silent.
The prices I paid for these same 5900X chips over the span I acquired them has been pretty interesting.
The first one I paid $549 for back when they were new and I had to wait 4 months for one as well, though AMD did offer me some test units which I declined.
The next pair I did at Best Buy for $449 each on 3/18 and the final two I did for $384 each on Friday 4/15.
By Monday 4/18, their price had fallen to $379 each @ Walmart, and by the time AMD launches the new Zen 4 fare I expect them to be going for as little as $299.
This is an absolute bargain because the new Zen 4 stuff is going to be through the roof and be hard to get for some 18 months or more, so purchasing 5900X fare within the next 2 months or so is going to be the sweet spot for grabbing the darn things.
I advise you not to dilly-dally over the matter either.
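For what it is worth, the purchase history above pencils out like so (prices as listed; the $299 floor remains my guess):

```python
# The 5900X purchases listed above, as (unit price in USD, quantity):
purchases = [(549, 1), (449, 2), (384, 2)]

total = sum(price * qty for price, qty in purchases)
count = sum(qty for _, qty in purchases)
average = total / count
print(f"total spent: ${total} across {count} chips, average ${average:.2f} each")

# Against the $549 launch-window price, the final pair was ~30% cheaper:
discount = 100 * (549 - 384) / 549
print(f"discount on the final pair vs the first chip: {discount:.0f}%")
```

That works out to $2,215 across five chips, averaging $443 each, which is why waiting for the free-fall paid off.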
The other worrying thing that is happening though is that the supply chain shortage is about to get much worse than it currently is.
Our Taiwanese friends at TSMC claim building a US based foundry is an exercise in futility, but I seriously take issue with that self interest claim.
In a scenario where China invades Taiwan, guess what will happen?
Pure 24-carat disaster for the computer and IT industry! No thanks, comrade!
Then there is what is happening in China with their latest Covid-19 lock-down insanity, which is also underlining why having all our eggs in the Chinese basket was a very serious strategic error.
China is doing this by the way for political statement purposes only, per my wealthy pals in Shanghai at any rate who are currently on a Government enforced diet.
The Chinese Government have apparently developed serious problems with Jack Ma type characters getting uber rich and are going back to their 1950s Marxist malarkey narrative, turning their back on the success of the nation.
Seems they may have another type of revolution on their hands to deal with as a consequence...
They clearly do not care if the people starve while they go about it either.
Some of the videos I have been sent from there by my Chinese pals are pretty darn disturbing.
It seems to me that the Chinese are busy squandering an opportunity they will never enjoy ever again while they are at it, which will see China sink back into slow decline and become a third world economy all over again with all of the classic hallmarks of poverty and starvation that goes with that game.
This is most unfortunate indeed.
I have spoken to many big corporations who have been slapped in the face by this situation, who are now all rushing out of China in a big crush to be first out the door.
As a result, we may have a period where the latest and greatest technology will also have prices aligned with insanity as we have seen with GPU cards these past three years though that seems to have been normalizing somewhat in recent weeks.
A temporary reprieve from insanity is my take on that situation.
I myself bought the latest PCIe 4.0 GPU cards for my fare pre-Covid, as the prices started to soar to insanity, but only 2 of my rigs, the Threadripper rigs, have the high-end TUF 3090 Ti cards in them.
Everything else I have is now running 1070 Ti or 1080 Ti GPUs, as the performance difference and price made me conclude these new PCIe 4.0 GPU things were neither technically nor cost justifiable in any way.
In fact, I may even switch the Threadripper rigs back to 2080 Ti cards for what they do and pocket even more beans on GPU profit while I am at it.
I am currently searching for used 2080 Ti Founders edition GPU across the interwebs thang.
I essentially got my GPU for free by buying the PCIe GEN 4 stuff at the right time and then selling it at an insane premium after using them for a few weeks and asking the obvious question, Why?
In fact, my whole 5900X and combined GPU upgrade shindig actually made me quite a lot of dollars.
I was also able to sell my 3800X and 5800X CPUs at $220-$300 per chip to offset what I was spending on the new 5900X chips, and the handsome GPU gains mean I still have money in the kitty for other upgrades, like the quieter fans and such, even with the 5900X upgrade and all the RAM righting I did along with the new 1 TB Kingston KC3000 NVMe SSDs they now all sport.
I also sold about 90 assorted NVMe 3.0 SSD I had lying around doing nuthin....
The only thing I did not upgrade was the X570 Motherboards.
I think I will run these for the next 24 months as well before I upgrade to the Zen 4 goodies that should appear starting in April-June of 2022.
This means I will have to purchase 5 new motherboards, but then I can sell the X570 motherboards complete with the Dark Rock Pro 4 coolers, RAM and the 5900X chips.
I will transfer the Kingston KC3000 NVMe SSD to the new X670 chipset motherboards when they come out but I will want these X670 thangs to be stable for 18 months if I even play that game again.
Zen 4 has a new LGA Pin schema in a much bigger CPU package, so the X570 motherboards will not be able to accommodate the new chips which will also require the new X670 chipsets.
In fact I may even skip the X670 generation entirely and wait for the Zen 5 goodies instead.
This will be pretty hard to resist but I will look long and hard at the test units that come my way before I buy again.
PCIe 5.0 GPUs are going to need some examination, as I cannot see the point based on the PCIe 3-to-4 experience for GPU cards, at any rate.
I did a lot of testing for GPU fare for scientific compute offload purposes and that for sure is worth the investment, if you are a scientist with such a workstation and workload need that is.
My conclusion on the GPU vs CPU workload question for the 5900X was that the 5900X won the argument because of its price point.
If you can afford a pair of 3080 Ti cards per rig you can change the super-computing dial in favor of the GPU offload model, but for the majority of us (gamers and power workstation users) a 5900X paired with a 1080/2080 Ti is still a pretty powerful combination.
More to the point, it is also at the right price.
Even power CAD users need to stop and look closely, as a 2080 Ti paired with a 5900X running AutoCAD or P-CAD, driving two Samsung 49" 4K monitors, is also more than adequate for the task at hand.
I built such a pair of rigs for one of my pals these past two weekends and it made me notice they do in fact do the job real swell.
I used the MSI X570 Godlike AM4 motherboards on those and armed them with four 1 TB Kingston KC3000 NVMe PCIe 4.0 SSDs each, via the XPANDER-Z Gen 4 add-in card that only comes with that particular motherboard.
I did that because this motherboard also comes with 10GbE and WiFi 6E capability.
At $698 per motherboard it was not a cheap solution either.
My pal claims his utilization and work throughput on these rigs went up 23% and he is very happy with the results indeed.
He also added one to drive his 3D printer from the models he builds from his CAD operations and he is also experiencing much success there, though 3D printing is a slow and tedious affair at the best of times, no matter how powerful the computer that drives it happens to be.
He was into custom-loop water cooling on his last setup, but I air-cooled both these new rigs of his with a Be Quiet Dark Rock Pro 4 and two additional Silent Wings 2 120mm PWM case fans, one either side of the cooler block in the case.
Some folks add a third fan to the Dark Rock Pro 4, but I left mine standard with the big and little fans it came with: they hold the CPU between 62 and 71.6 degrees C at a full 4.5 GHz load with 16 dB noise levels, so a third fan is pointless.
We initially tried a pair of 2080 Ti Founders edition cards in these but we were not using the second GPU much so we went with just one 2080 Ti Founders edition GPU per rig.
We also tried the newer 3080/3090 Ti fare but could not tell what we were getting vs the 2080 Ti so we flipped back to these 2080 Ti cards.
He will put the A40 cards in them when his company buy them for him.
The biggest benefit is all he can hear now is the 3D printer clacking and clicking around and his office temps also dropped by 22 degrees C with these new rigs.
This is his home office setup as he seldom goes to the office these days.
He does have a Threadripper rig at his work office for his CAD work with tricked out A40 GPU cards but he claims the only benefit is the 12K Monitors they drive for high precision work.
He goes to the office every other Friday for 2 hours to do that piece and is unwilling to shell out for the gear that would take at home.
For those of us with general computing and serious gaming workloads the Aorus X570 Elite motherboard is also the best one for the 5900X and that too is at the right price @ $179 each these days...
Soon the cost of a very powerful desktop rig is going to be pretty low.
For $2715.14 plus taxes you can get yourself quite a powerhouse rig that can cope with anything you throw at it fairly easily.
Note that this includes a 35" curved screen ultra wide monitor as well as my fave keyboard and mouse setup for CAD and Visio work.
All you need add is a Logitech 4K Camera and a Blue Snowball Microphone and you are in the conference call business.
This motherboard has a Realtek digital audio output port as well as jacks for speakers and headphones, and this case has headphone jacks on the top to plug into easily.
I am currently building 6 of these rigs with dual 1 TB NVMe SSDs in them, so those are just $180 more for the second SSD, in a RAID 1 config (mirrored 1 TB boot disk) on some, while others are 2 TB storage setups with one 1 TB NVMe boot disk plus a 1 TB archive volume.
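On a Linux rig, that mirrored-boot idea maps to software RAID via mdadm. A minimal sketch that just composes the command (the device names are placeholders; check yours with lsblk, and note a boot mirror is normally set up at install time):

```python
# Sketch: compose the mdadm command for a two-drive RAID 1 (mirror) array.
# Device paths here are illustrative, not real hardware of mine.

def mdadm_mirror_cmd(array, devices):
    """Build an `mdadm --create` command line for a RAID 1 mirror."""
    if len(devices) != 2:
        raise ValueError("a simple mirror wants exactly two devices")
    return (
        f"mdadm --create {array} --level=1 "
        f"--raid-devices=2 {devices[0]} {devices[1]}"
    )

if __name__ == "__main__":
    print(mdadm_mirror_cmd("/dev/md0", ["/dev/nvme0n1", "/dev/nvme1n1"]))
```

The Windows-based podcast rigs would use the motherboard's own RAID facility instead, but the mirror-vs-separate-volumes trade-off is the same.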
The guys who want these are all podcast engineers who also make video advertising shorts on Windows based software setups.
Some are going with Dual monitors but we found this a bit much and you need a real big desk for that game as well as Dual 3060 Ti cards at the entry level.
There is only one use case that benefits that rig setup.
Obviously the Podcast setups are decked out with their existing Microphones and equalizer setups with all sorts of Microphone filters and the usual software that manages the editing of that stuff.
And these guys all have exotic AudioQuest HDMI and other cables that actually cost more than the computer rigs cost, per cable (I kid you not!).
My Swedish pal Per has two 12K $4500 active HDMI cables on his rig!
They used to buy Intel Workstations that came in landed bare-bones sans monitors at $6500 apiece.
That's pretty rich fare for most folks.
The selling point of my Ryzen 9 build is the silent case I selected and the raw performance they offer.
These benchmarked 27% better than the exotic Intel Workstation Xeon shod platforms too by the way.
Those Intel made Workstations are pretty awesome, I must admit.
Some of the graphics design guys went with a TRX40 Threadripper setup armed with the 3970X 32-core CPU and dual 11 GB 2080 Ti cards, with 128 GB RAM plugged into the TRX40 motherboards.
That particular selection offers the best bang for the buck and is also all air cooled by the way.
I have started receiving all of the parts for the forthcoming Threadripper 5995WX games and amuse myself cutting pipes and tubing with the special tools Titan Rig sent my way, ahead of the pleasures and sweet cussing soon to be heard emanating with much passion from my lab work area.
I have also started to note that the 5900X platforms can also do most of the work of the current Threadrippers but obviously not as quickly.
I have also yet to see any stuttering on the 5900X shod systems, but then they all do have a cheapie TPM module plugged into them.
For those of you with a nose for a bargain who want a kick ass gaming and workstation rig, ponder the 5900X setup deeply.
It R a Bargain!! IMHO of course!