About macOS GPU Drivers - from Apple
For those who may have missed it, the Apple eGPU document was updated about a month ago to add information about "macOS GPU drivers." I've pasted the excerpt here:
About macOS GPU drivers
Mac hardware and GPU software drivers have always been deeply integrated into the system. This design fuels the visually rich and graphical macOS experience as well as many deeper platform compute and graphics features. These include accelerating the user interface, providing support for advanced display features, rendering 3D graphics for pro software and games, processing photos and videos, driving powerful GPU compute features, and accelerating machine learning tasks. This deep integration also enables optimal battery life while providing for greater system performance and stability.
Apple develops, integrates, and supports macOS GPU drivers to ensure there are consistent GPU capabilities across all Mac products, including rich APIs like Metal, Core Animation, Core Image, and Core ML. In order to deliver the best possible customer experience, GPU drivers need to be engineered, integrated, tested, and delivered with each version of macOS. Aftermarket GPU drivers delivered by third parties are not compatible with macOS.
The GPU drivers delivered with macOS are also designed to enable a high quality, high performance experience when using an eGPU, as described in the list of recommended eGPU chassis and graphics card configurations below. Because of this deep system integration, only graphics cards that use the same GPU architecture as those built into Mac products are supported in macOS.
Which clearly means no NVIDIA until a modern NVIDIA GPU is used inside a Mac.
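As an aside, if you want to see which GPUs macOS currently recognizes on your machine, `system_profiler SPDisplaysDataType -json` dumps the graphics/display report as JSON. Here's a minimal Python sketch that pulls the GPU model names out of that report. The sample data is illustrative only, and the key names (`sppci_model`, `_name`) are assumptions based on typical output; they may vary between macOS versions.

```python
import json

# Illustrative sample of `system_profiler SPDisplaysDataType -json` output.
# Key names here (sppci_model, _name) are assumptions based on typical
# reports and may differ between macOS versions.
SAMPLE = """
{
  "SPDisplaysDataType": [
    {
      "_name": "Radeon Pro 560X",
      "sppci_model": "Radeon Pro 560X"
    }
  ]
}
"""

def list_gpus(report_json: str) -> list:
    """Return the model name of every GPU entry in a SPDisplaysDataType report."""
    report = json.loads(report_json)
    return [gpu.get("sppci_model", gpu.get("_name", "unknown"))
            for gpu in report.get("SPDisplaysDataType", [])]

if __name__ == "__main__":
    # On a real Mac, feed this the actual output of:
    #   system_profiler SPDisplaysDataType -json
    print(list_gpus(SAMPLE))  # ['Radeon Pro 560X']
```

On a real machine you'd pipe the live `system_profiler` output in instead of the sample string; any GPU Apple doesn't ship a driver for simply won't show up with a functioning entry here.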
Basically reiterates in documentation what their actions have already demonstrated. Apple wants to lock down macOS to make it more like iOS/iPadOS, so no third-party code is welcome anymore in their kernel space. It's why the new DriverKit was implemented to keep USB peripheral drivers out of the kernel, and why no web drivers were ever released for Mojave. As for only supporting chips in existing Macs, the Radeon VII sort of counters that a little bit, though I guess the Mac Pro has been in development since before the Radeon VII was released. It would be nice if they put prerelease drivers in for Navi, though it's not clear if Apple will even use Navi in any Macs.
Interesting to see them state that so clearly... so we are not going to get Navi drivers anytime soon unless they release Macs with them. The iMac Pro is the obvious candidate, but as I mused in the event thread, I feel Apple has doubled down on Vega with the Mac Pro, and as a result we will see various iterations of similar cards in the Mac lineup for the next year or so.
I may be adding 2 and 2 and getting 57 but... I am connected with a couple of Apple folks on LinkedIn and have been seeing job roles that make me suspect Apple is working on its own GPUs, which IM(limited)O may make more sense than switching to ARM CPUs.
It would not be a stretch for Apple to consider their own GPUs for Mac. The biggest limitation is the TDP (by virtue of available chassis/cooling and power) they are willing to work with (speaking for MacBook Pro). IIRC the discrete GPUs in the MBP never exceed 35W - a very low power envelope to work with. Other laptops with GTX 1050 Ti for instance may consume upwards of 60W for the GPU alone.
@mac_editor Thank you for sharing this snippet. At least we know what to expect from Apple. I also agree with you about Apple’s way of designing its Mac systems. They have kept the laptops under 100W TDP for a long time. Come to think of it, that’s why the PowerBook G5 never happened!
External GPUs present an outside-the-box solution, so it's unclear why they haven't lifted that restraint. Desktop GPUs from the green team have been more efficient than the red team's in recent years. IMO there are still non-technical reasons Apple refuses to engage with Nvidia.
Apple's main selling point is security.
How does one achieve security? Simple answer: remove any possible way for users to modify the machine, both the software and the hardware.
The T2 chip and the new code are examples of this security focus, blocking any external attempts to mod the device.
This also means that Apple is the sole owner of its customers' devices and data, especially if the devices are fairly immune to agency backdoors and other planned 0-day OS exploits.
This puts Apple in an advantageous market position compared to other OS and hardware manufacturers, which obviously, at the time of writing, have some 0-day exploits carefully hidden away.
Long story short, no more hardware swaps: each piece of hardware in the device has a unique ID that is tied to the T2 chip's code and also to the OS.
One can argue that software mods would not even be needed anymore if one can't even use hardware in the device other than what it was pre-designated for.
To give you an idea of Apple's ideal device: it should be like a tissue, easily thrown in the bin and replaced with a new one.
As for Nvidia, I mean, Mr. Leather Jacket would ask Cook for even his pants; providing RTX, CUDA, and Quadro stuff for the Apple MacBook Pro line would give a huge boost to Apple's sales.
Which means a real ton of money that Apple is not willing to hand to Nvidia, because it would obviously give Nvidia an even more egregious market share and earnings advantage.
Big companies are very, very friendly when it comes to making money and pleasing shareholders, but there is a limit, I suppose, where the friendly chitchat ends.
When it comes to total market monopoly, I suppose one would be a bit more careful not to feed the already dominant market monster. ^^
About Navi support, a simple question to ask yourself: does Apple need Navi at the moment?
The answer to this question will also shed light on future GPU support. As I said previously, I would not count on Apple's goodwill, but I would be glad to be proven wrong.
Indeed. I believe they have gone for consistent but lower-performing designs. Note how modern comparable Windows laptops typically require a power brick to attain maximum performance; on Mac notebooks, performance is identical regardless. The primary problem is that CPU innovation has not kept up with Apple's requirements (or vice versa, Apple is unable/unwilling to adapt). For a 45W Intel part, Apple designs a thermal system capable of handling 45W. Combine that with a 35W GPU, and things go south fast. What they should be doing is build a system that handles far more than that, since Intel boosts far higher. But then going sleek is a challenge, plus they consider running the CPU at base clock acceptable, as technically this isn't thermal throttling. The 2019 MBPs, for instance, seem to come factory undervolted, hence the improvements in thermal performance. The best course of action would be custom silicon. They already have a lot of the system management components, ISPs, and even video encoding with the T2, so they might as well add some CPU cores haha.
Well, I would not be surprised if the T2 chip is nothing other than a modified version of an Ax processor like the ones Apple uses in the iPhone. ^^
I mean, at the moment the T2 chip has taken control of almost everything in a MacBook, including some functions that were dedicated to the PCH.
I suppose Apple now controls the whole computer initialization when the machine boots, so it needs Intel only for the CPU/IO job.
A lot of signals that were routed to the PCH are now routed to the T2, if I can believe the few glimpses of schematics I have seen.
This includes, as you said, system management, power management, and data management.
Indeed. I have some other gripes with the T2. The T2 ISP is horrible; my 2014 MBP had a better camera than my current 2018 model. The video encoder/decoder has questionable quality in some scenarios, and there is no good API to make optimal use of its capabilities (the existing one has extremely limited functionality). At the least, I want a better T-chip haha.
I would suppose that the success of Apple's security strategy lies in how features are developed; I mean, whether it all just works.
If advanced users strongly feel the need to optimize features, it means that in the end the strategy will tend to crumble.
If one removes the need to optimize because the features already give you the best, at that point Apple can really close all the doors successfully.
I did not even know that the T2 chip was used to encode/decode, but rest assured that with current and future silicon nodes, Apple could build a really powerful co-processor managing everything if it wanted to. ^^