Apple’s WWDC 2023 Keynote Highlights Innovation In Machine Learning
It’s not just about the new Vision Pro AR headset.

When you think about tech companies and their role in advancing machine learning (ML) and artificial intelligence, is Apple on your list? Let’s be honest, the biggest and splashiest research in the last eighteen months has come from companies like OpenAI (the GPT family of models, DALL-E 2), the Google Brain Team (Bard, Imagen), and Meta (SAM, LLaMA). But after watching Apple’s WWDC 2023 Keynote and thinking about how Apple applies ML in their software, I’ve decided they deserve a spot on the innovators list as well. Let’s go over some of the announcements from the keynote, and I’ll explain my reasoning. FYI, there’s a lot more to it than the reveal of the Vision Pro.
New Mac Hardware
Apple has always had strong ties to the liberal arts, and their new lineup of hardware reflects that continued relationship. New models of the Mac Studio and Mac Pro were announced that can be configured with Apple’s latest M2 Max and M2 Ultra chips. Clearly these machines are being marketed primarily to companies in the media industry. Apple said as much as they went over performance enhancements in video editing software like Adobe After Effects and dropped the names of high-profile customers like NBC’s Saturday Night Live.
But there was also a brief moment where Apple boasted about the new M2 Ultra chip and its applications in ML.
… in a single system it can train massive ML workloads like large transformer models that the most powerful discrete GPU can’t even process because it runs out of memory.
For those not entrenched in the latest GPU specs, NVIDIA’s top-of-the-line consumer GPU, the RTX 4090, has 24GB of VRAM. Put alongside a maxed-out M2 Ultra with 192GB of unified memory (shared across all compute tasks), it’s clear you can train larger models on these new machines. But was Apple trying to dangle a carrot in front of ML practitioners?
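To put some rough numbers behind that claim, here’s a back-of-the-envelope sketch. It’s my own illustration, not a figure from the keynote: it assumes the common rule of thumb that mixed-precision training with the Adam optimizer costs roughly 16 bytes per parameter (fp16 weights and gradients plus fp32 master weights and optimizer moments), and it uses PyTorch’s standard MPS backend check, which is how Apple Silicon’s unified memory is exposed as a GPU device.

```python
import torch

# Rough memory estimate for training a transformer with Adam in mixed precision.
# The 16-bytes-per-parameter figure is an assumption (a common rule of thumb),
# not a number from Apple or NVIDIA, and it ignores activations and overhead.
def training_memory_gb(n_params: float) -> float:
    weights = 2 * n_params      # fp16 weights
    grads = 2 * n_params        # fp16 gradients
    optimizer = 12 * n_params   # fp32 master weights + Adam first/second moments
    return (weights + grads + optimizer) / 1e9

for billions in (1, 7, 13):
    print(f"{billions}B params ≈ {training_memory_gb(billions * 1e9):.0f} GB before activations")

# On Apple Silicon, PyTorch's MPS backend draws from the same unified memory
# pool as the CPU, so the GPU can address far more than a discrete card's VRAM.
device = torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")
print(f"Training device: {device}")
```

By this estimate, even a 7B-parameter model needs on the order of 100GB just for weights, gradients, and optimizer state, which blows well past a 4090’s 24GB of VRAM but would still fit inside the M2 Ultra’s 192GB of unified memory.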
There is a lot to unpack here. Just three years ago Apple was still relying on Intel for their CPUs and AMD for their GPUs. I had contemplated using my iMac with an external GPU for ML at the time, but with little software support I ended up building a custom…