By Fred

Bella Figura

Updated: Apr 30, 2022



Perception, it seems, is everything. Ask any Italian fashion guru...


Like most IT professionals, I have always owned an interesting collection of computer systems: monitors, mice, network switches, routers, printers and many keyboards, to name but a few of the devices in my personal gizmo collection.


My more exotic systems included things like a Sun 10K, IBM AS/400s, Cray 5 and Cray 6 systems, Hitachi all-flash storage arrays, an EMC Symmetrix DX4500 storage array, a pair of DEC Alpha EV5s and various Synology NAS devices.



It's actually pretty ridiculous when I go and stare at it all in my garage and my office, if I am honest about it.


I have tried to limit costs on the laptop and desktop front by keeping things like mice and monitors for a very long time indeed (more than five years).


In recent times, laptops became an industry problem: they have always suffered from underpowered power supplies, and the peripheral bus they come with never allows enough devices to be plugged in, so USB hubs were added to the pile of confusion on the laptop peripheral side of the equation.


This was really all about cutting the cost of the power supply required in an average laptop.


Each laptop OEM adopting its own standards here has clearly not helped much, and a few years ago I started noticing some interesting things on the monitor side of the equation: each different OEM FRED I connected a monitor to yielded results that ranged from good to bad.


And here was me thinking the IEEE would be the ones driving all these computer-related standards, all to achieve the desired universal computing-device sanity?!


So anyway, come December 2020, when Apple released an OS that would no longer be supported on my prized 15" 2012 MacBook Pro, I had some decisions to make, with all these factors as ingredients in the mulling mix while I projected the costs I was getting into.


Abandon Apple, or get a new modern Mac? That was the prime consideration of the moment.


Being an ex-microprocessor guy helped me understand how wonderful the new Apple ARM-based silicon was, and I just could not resist the potential it could bring.



However, with it came a learning curve that underlined serious flaws in the computer industry, and the mitigating factors you have to consider, pragmatically, to get the best out of the various new-age FREDs you may be eyeing for personal use.


My new M1 MacBook Air started frying older monitors connected through my vast collection of USB-C hubs, and eventually the M1 itself got fried, thanks to Apple "forgetting" what the mess of USB-C hub standards had led to and the implications for new M1-based systems connected to incompatible ones.


Then I started diving into the precise details of these various devices, the good, the bad and the ugly, and came to find that Apple had not been playing the USB hub game with the due diligence it deserved.


So, after my first M1 Air got fried, I had to order some exotic USB-C hubs and matching cables to mitigate these issues.


Still, the monitor issues remained a significant challenge: each legacy monitor and USB-C hub combo behaved differently on each laptop, with very different results and experiences across the various combinations of hubs, cables and such.


Quite a matrix of combinations was the result.


This started to irk me somewhat, and grew into my being full-on pissed off about it all to boot.


I have bought several monitors over the last three years in the belief that they would work with any combination of USB hub and laptop going forward.


It got to the point where I recently acquired a new Lenovo ThinkVision 27" USB-C monitor and some Apple 4K monitors with USB-C and Thunderbolt 3 capability, to conclude this story and crystallize the view for me somewhat.


I compiled the stats on which monitors worked best with what hubs for each FRED and all their various quirks (yes, there were indeed many "quirks").


I proceeded to stare at the data for a long time.


This is because I was in denial about what the data was telling me, willing it to change, believing that somehow it would if I kept looking at it long enough...


Here, then, is the first reality to soak up and ponder at length if you have modern laptops in your home network.


Old monitors without USB-C capability are now junk for modern desktop GPUs or general laptop use, so junk them and get new ones that are USB-C capable.


Consider that of the 20 or so monitors I had lounging around, NONE worked properly with the M1 MacBook Air!!

These new monitors do not have to be 4K either; I have in fact gone back from 4K to 2K on some of them, but they all carry video over USB-C (DisplayPort Alt Mode).


No more HDMI!!


All the new USB-C monitors worked with both the M1 MacBook Air and the 13" M1 MacBook Pro.
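If you want to sanity-check what macOS actually negotiated for a given hub, cable and monitor combination, a few lines of Swift against the standard CoreGraphics display APIs will show you what the OS sees. A minimal sketch (the 16-display cap is just my arbitrary choice):

```swift
import CoreGraphics

// List the displays macOS currently recognizes, with their pixel dimensions.
// Handy for checking whether a hub/cable/monitor combination actually
// negotiated the resolution you expected.
var displayCount: UInt32 = 0
var displays = [CGDirectDisplayID](repeating: 0, count: 16) // 16 is an arbitrary cap

if CGGetActiveDisplayList(UInt32(displays.count), &displays, &displayCount) == .success {
    for id in displays.prefix(Int(displayCount)) {
        let mainTag = CGDisplayIsMain(id) != 0 ? " (main)" : ""
        print("Display \(id): \(CGDisplayPixelsWide(id)) x \(CGDisplayPixelsHigh(id))\(mainTag)")
    }
}
```

If the numbers printed there do not match what the monitor's spec sheet promised, suspect the cable or the hub before you suspect the laptop.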


If you have many desktops like I do (six AMD Ryzen-powered ones), you can get some extended use out of those old monitors if you have a decent PCIe 3.0 graphics card in the box.


However, if you have a PCIe 4.0 GPU in your AMD Ryzen or Threadripper desktop, do yourself a huge favor: junk your old monitors and get new USB-C-based 2K/4K ones.


There is no point in strapping a 20,000 HP jet engine to a man on rollerblades!


If you have bought a PCIe 4.0 GPU, you will want a monitor capable of enjoying its capabilities. Make sure the new monitor has USB-C video connectivity and a suitable cable, so that your newer laptops can plug into it as well.


The USB-C cable question for modern monitors is a huge issue, by the way. You cannot plug any old USB-C cable into your 4K USB-C monitor and expect things to go real swell, because they ain't gonna.


Plugging your ancient monitor into a PCIe 4.0 GPU is like buying a Bugatti Veyron and then hitching it to two horses when you run out of gas.


The first question you will be asked when somebody eyeballs that spectacle is WHY?

A 2 HP Deux Chevaux!!


I myself have found, courtesy of my new M1 MacBook Air, that as long as you have a supported USB-C hub and a USB-C monitor with the right cables, you will be A-OK.


If you want to enhance your sadomasochistic experience, by all means keep using your old monitors and cheap, unsupported USB-C hubs with any old USB-C cable.


Me, I prefer golf for enhancing futility and burning my time and cash.


Another reality of the new compute world concerns mice, both for laptops and for desktops.


It is my experience that Bluetooth mice on laptops and desktops are mandatory these days.


Those USB-C hubs are extensions of your laptop's communication bus, and the ports are not limitless.


You also get what you pay for.


You will also notice that moving the cable between a cheap USB-C hub and your new-generation laptop causes disconnects and interruptions.


As a result, I have found the mouse experience with the OS is much better if the mouse is Bluetooth 5 based.


All my desktop and laptop setups now use Bluetooth meeses.

The only USB stuff I have left is video cameras and microphones.


These too will be replaced by new BT 6.0 capable devices in the near future.


I have a large collection of exotic and expensive microphones for Zoom sessions on all my machines; they are about the only things I plug into USB ports these days.


All my keyboards are also new: Logitech MX series in Bluetooth mode for my two Macs, and Microsoft BT keyboards for all the Windows devices (laptops, servers and desktops).


This has delivered a more sublime and enjoyable computing experience, but I need a better answer than the TeckNet BT mouse I am currently using on all of them.


This is because I have to do weird stuff to wake them up; they're not one-click active like the Apple meeses. Then again, they were $10 each...


This brings up the expensive issue of sound systems.


I have a lot of Bang & Olufsen and Bose sound systems in my garage that I paid an indecent amount of money for.


Sadly, these were almost all BT 2.0 or BT 3.0 based.


In the throwaway world we live in these days, you are not supposed to keep such things longer than three years.


However, I still own and operate serious hi-fi components from the late '90s and early 2000s.


My AR speakers, which I paid over $10K for, are immortal.



That is to say, I will never replace them. If I ever did, it would be with McIntosh XR100s at $5K each.


Sadly, when you buy expensive watches or computer gear these days, you also have to buy into the concept that they are temporary.


In the case of digital watches, had they been designed right, they would have supported the BT 5.0 and future BT 6.0 specifications via firmware upgrades.


These days everything is sold on a subscription basis.


I balked at buying an $11K TAG Heuer watch the other day because it uses tech that is obsolete out of the gate and will make it hard to wear in 18 months.


My current old-school TAG has been strapped to my arm for 15 years now, and it was just shy of $4K when I bought it.


New digital watches should have been built with a smart design that allows the innards to be replaced when they become obsolete. Are you receiving this loud and clear, Frédéric (Arnault), old chap?!


It would also be nice if they did not depend on Google's Android OS, but instead developed a TAG Heuer watch OS that would work with any smartphone.


It's the spirit of the age, boys and girls, the spirit of the age!


Back to the M1 experience...


A lot of current software was written for Intel CPUs running macOS (OS X) variants.


The M1 has a translation layer called Rosetta 2 that converts x86-64 Intel code into ARM speak.


It works rather well, but I have noticed that newer software updates with native ARM M1 code are frighteningly quick compared with the old written-for-Intel x64 stuff.


My AI chess software is a case in point.


It was fast running x64 code built on OS X Sierra with Rosetta translating, but native code, written and compiled for the M1's ARM CPU and running au naturel, was in a different dimension altogether.
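If you are ever unsure whether a given process is running natively or being translated, macOS exposes that through the sysctl.proc_translated flag. Here is a minimal Swift sketch of the check (the function name is my own):

```swift
import Foundation

// Ask the kernel whether the current process is being translated by
// Rosetta 2. Returns nil when the sysctl key does not exist at all
// (e.g. on an Intel Mac, where there is no translation layer).
func isTranslatedByRosetta() -> Bool? {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == 0 else {
        return nil
    }
    return translated == 1
}

if let translated = isTranslatedByRosetta() {
    print(translated ? "Running under Rosetta 2 (translated x86-64 code)"
                     : "Running natively on Apple silicon")
} else {
    print("No translation layer in play (likely an Intel Mac)")
}
```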


I just completed tests with four M1 MacBook Pros running my ARM-based code, beating up an IBM Watson setup at chess AI.


This capability is seriously impressive.


Watson does not have an 8-core GPU like the M1 does.


Neural network stuff runs as if it were built for these ARM-based CPUs.


The result is cheaper and much faster silicon that also runs way cooler.
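You do not need a chess engine to feel this. Here is a toy Swift sketch using Apple's Accelerate framework; it is not my chess code, just an arbitrary million-float dot product that shows the kind of SIMD-heavy work that flies on this silicon:

```swift
import Accelerate

// A toy vectorized workload: a dot product over a million floats.
// vDSP routes this straight through the chip's SIMD units, so the same
// code compiled natively for the M1 runs dramatically faster than a
// translated x64 build of the same thing.
let n = 1_000_000
let a = (0..<n).map { Float($0) }
let b = (0..<n).map { Float($0) * 0.5 }

var dot: Float = 0
vDSP_dotpr(a, 1, b, 1, &dot, vDSP_Length(n))
print("Dot product of \(n) floats: \(dot)")
```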


I also just finished some high-level tests using Pixelmator Pro, which has likewise been natively coded for the new M1 ARM silicon.


Boy, what a difference!!


For shits and giggles I also got hold of a copy of Blizzard's new World of Warcraft 9.0.2 release, and I gotta tell you, it's a whole new gaming experience!!



My pals in the gaming world tell me they are now frantically coding new native M1 games, like a trillion elves in Santa's new workshop.


Apple have hit a solid gold mine with this thing!!


We even hooked up a new Samsung 8K monitor and got ourselves some serious WOW this past weekend.


I will bet big bucks that new game consoles from the usual suspects will all be ARM-based as a result of these impressive gains.


The future has arrived, but beware the pitfalls.


Out with the old gear and in with the new!! Pronto, pronto, per favore!


Grazie infinite...

















