Why is the iPhone just now getting USB-C?

THE HISTORY OF LIGHTNING, APPLE’S secretive MFI program explained, AND THE REAL STORY behind APPLE AND USB-C

With Apple's Wonderlust event today, the iPhone 15 lineup (and AirPods Pro 2) have finally switched to USB-C. But as much excitement as there was about the change in the time leading up to the event, there were also complaints about rumored limitations Apple was going to implement on both charging and data transfer speeds.

We now know the data speed limit has a straightforward explanation: only the A17 Pro chip contains the sub-processor necessary to allow USB 3.1 transfer speeds up to 10 Gbps. And contrary to what you may read, this is pretty much in line with industry standards; the only Android phone supporting 10 Gbps is the Asus ROG Phone II. And because Apple uses previous-generation chips in its non-Pro models to reduce cost, it just wasn't possible to support 10 Gbps on the base iPhone 15 models.

When it comes to power delivery, Apple specifies that the iPhone 15 models will still be limited to 20W fast charging, although theoretically it can go up to 30W. This is definitely slow by today’s standards when it comes to Android phones, some of which support up to 150W, but more on that later.

Aside from frustration about the potential limitations, speculation abounded about why Apple was only making the switch now—is it really because of the EU laws? Or is there something more to it?

Regardless of the truth behind Apple’s choices, it’s tradition at this point for people to target them for the perceived injustices towards their customers. Comments like: “Apple just wants to sell Lightning cables!”, “Apple is always so far behind!”, or simply "Apple sucks!" are commonplace. News articles opt for flashy headlines and often don’t delve deeper—and Apple, of course, remains silent about its product engineering choices and how the company operates internally. Few know about the MFi Certification Program, and even fewer know about how it works.

This has bothered me for quite some time, and so I began writing this blog post when the rumors around USB-C limitations on the iPhone 15 lineup first started to surface. I wanted to shed some light on the history of the Lightning connector, Apple’s MFi program, their relationship with USB-C—and how the thread that runs through all of that has driven their product choices over the years. And I believe once you have the complete picture, you’ll see those perceived injustices are anything but.

Wait, who are you?

While I won't bore you with all the details, the key information you need to know is that I've been a software engineer with a focus on the Apple platforms for almost 15 years now. And although I've never been directly employed by Apple, I did work for a well-known US company between 2012 and 2015 that manufactures charging cables, battery packs, and other accessories for Apple products.

This puts me in a position to offer some insight into this topic. Most of what I discuss here is more or less publicly known at this point, so hopefully I won’t get an angry letter from Apple's lawyers. The goal isn't to expose the internals of the MFi program, but to connect the dots across the history of Apple's products and their connectors, so you can understand the reasoning behind the choices they've made.

I'll provide as many references as possible—but when it comes to anything about the MFi program, that information generally isn't available online. For security, documentation on Lightning is shared only with partners in the program, and each copy is watermarked for that specific company to deter it from getting out. So some of what I share here you'll just have to take my word for.

Let’s begin by addressing the Lightning connector…

A History of USB & Charging

In order to better understand Apple’s reasoning behind creating the Lightning connector in the first place, we have to travel back in time 13 years (yikes, has it really been that long?) to when Apple first began developing it: 2010.

In 2010, USB 3.0 spec devices were barely hitting the market; almost everything was still on the USB 2.0 specification. Moreover, the power specification at the time for charging over USB was limited to just 7.5 watts. And obviously, a reversible connector wasn't something anyone else was even thinking about.

USB-A to Micro-USB cable

So with the need for a slimmer connector than the 30-pin to fit upcoming iPhone designs, and disliking micro-USB like the rest of us, Apple began developing Lightning. But it wasn't just about design: they knew Lightning had to be forward-thinking enough to work in all their devices for—in their words—at least 10 years. And that required building a connector that packed way more capability than anything else that existed at the time.

Lightning: Not Your Parent’s Serial Port

There are plenty of articles that dive into the technology of Lightning, so I'll keep this as brief as possible. The connector itself contains several tiny chips packed into a very small space, which is impressive in its own right. Aside from validating its authenticity as a legitimate Lightning connector, this allows it to support a variety of additional features:

  • Charging

  • Data at USB 2.0 speeds (480 Mbps)

  • VGA

  • HDMI

  • Audio (with an on-board DAC). At the time this might have seemed unnecessary, but as we now know, it was in anticipation of killing the headphone jack.

Closeup of a disassembled Lightning connector

In order to support all this with only 8 pins (6 if excluding power), the connector needed to be able to handle tons of different types of data and route it appropriately to whatever was on the other end of it—some of which, like HDMI connectors, have way more than 8 pins. Nothing like this existed at the time. The only thing that came close was Intel's Thunderbolt, which was still in development.

It’s probably obvious to those who’ve used Apple products for a long time—but can you guess who Intel’s developing partner on Thunderbolt was? Yup, Apple. Thunderbolt & Lightning!

Those tiny chips were engineered well enough that updates on the host device would allow the connector to support new or updated protocols, without needing to design a whole new connector. In addition to allowing Apple to support updates to the HDMI spec, or new adapters entirely like Ethernet, SD cards, etc—it also allowed them to support changes in USB specifications.

And what new USB specifications would be important to support? If you guessed faster charging, you are correct. Apple not only knew that their devices would become more power hungry as time went on, but they knew USB-IF was working on the first proper USB Power Delivery specification, which would allow up to 100W of power delivery. And that specification was only due to be finalized just before Lightning would debut on the iPhone 5 in mid-2012.

Knockoffs: Handbags, Charging Cables, and Malware

As everyone knows, certain countries love to knock off high-end products. While this may be fine for the $50 "Gucci" handbag you bought your partner for Valentine’s Day that they secretly know you couldn't possibly have afforded, it's a big problem when it comes to technology.

At the company I worked for, we had a task force that worked daily with US customs in a constant game of whack-a-mole trying to block imports of counterfeits of our charging accessories to the US. Not only could those counterfeit devices hurt our brand image and be a nightmare for our support team—they could possibly damage your phone or worse: cause it to catch fire and burn your house down.

Police raid a shipping container containing counterfeit goods (source: Greater Manchester Police)

As if that wasn’t enough, Apple knew that just allowing any device to connect to an iPhone could be a security risk. There were already documented cases of bad actors selling thumb drives, battery packs, and even charging cables with embedded malware that could compromise the device they’re plugged into, or the data being transferred over the wire. While you can mitigate this with software on the host device, it’s much more robust to block the connection from being established in the first place by embedding a means to perform an authentication handshake in the Lightning connector itself.

This mechanism also enabled Apple to continue requiring third-party accessories to go through its MFi Certification program—which was introduced alongside the 30-pin connector on the iPod all those years ago. MFi, at its core, was a way for Apple to ensure third-party accessories adhered to the technical specifications of their connectors, worked well for the end user, and didn’t do anything that may damage or otherwise compromise the security of their devices.

You can think of MFi as something like the FDA—before you can sell a new drug or food product, it has to be reviewed to ensure it isn’t going to kill anyone or make them sick.

Once a manufacturer got an accessory approved by Apple, they could then bulk-order male (and only male) connectors to use on their devices. In addition, should Apple later find that a previously-approved accessory was causing egregious harm to one of their devices, there’s a mechanism in place that allows them to retroactively block it from connecting.

You may have experienced this yourself when a cable you’ve been using suddenly stops working after an iOS update—seeing a cryptic “This accessory may not be supported” error message on your iPhone. How this works isn’t exactly known, but we can assume that Apple tracks which batches of Lightning connectors are sold to each manufacturer and, through an OS update, can revoke those Lightning connectors from authenticating with its devices.
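If that hypothesis is right, the device-side logic could be as simple as a revocation-list lookup shipped with each OS update. Here’s a minimal Python sketch of the idea—the batch IDs and function name are entirely hypothetical illustrations, not anything known about Apple’s actual implementation:

```python
# Hypothetical model of batch-based revocation: the OS carries a list of
# connector batches that are no longer allowed to authenticate.
REVOKED_BATCHES = {"BATCH-2014-0037", "BATCH-2015-0112"}  # made-up IDs

def accessory_allowed(batch_id: str) -> bool:
    """Return False if the connector's batch has been revoked."""
    return batch_id not in REVOKED_BATCHES

print(accessory_allowed("BATCH-2016-0001"))  # True: still authenticates
print(accessory_allowed("BATCH-2015-0112"))  # False: "accessory may not be supported"
```

Distributing the list with OS updates would explain why a cable can work fine one day and be rejected right after you update.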

Later, Apple found that even all of this wasn’t enough, as Grayshift found a way to crack iPhone passcodes via hardware connected with first-party (Apple) Lightning cables. Which is why a few iOS/macOS releases ago, they started requiring you to enter your passcode before the device will even begin the connector verification step.

MFi: A Money Grab?

By now, I hope it’s a little more clear why Apple has the MFi program and insists on requiring manufacturers to get approval before they are able to sell accessories that use the Lightning connector. But maybe you still aren’t convinced…you might still be thinking “whatever, it’s just a way for them to sell $30 Lightning cables”—so let's dive into the cost side of it.

While I agree $30 is egregious for most cables, you can find dozens of third-party alternatives that don’t even come close to that price. Those exist because Apple, through the MFi program, allows third-party manufacturers to buy genuine Lightning connectors from them—and the manufacturers can then turn around and charge whatever they want for their accessories. And so long as the connector isn’t overly priced, manufacturers have a big incentive to make products for the Apple ecosystem.

At the time I left the aforementioned accessory company, the cost of a certified Lightning connector was around $2 for charging-only and $4 for data+charging. I’m not sure what the price is now, but with inflation it has probably stayed the same despite likely getting cheaper to manufacture.

While $2-$4 is a lot considering it costs about that to manufacture a whole cable, at the time Lightning was introduced there wasn’t anything like it. So we don’t really have something to compare it with to know for certain whether Apple was making some kind of profit. But given all the technology crammed in such a tiny space, and the cost of running the MFi program itself, it’s unlikely.

In keeping with the FDA analogy, just the first step of getting a single product approved for sale can be upwards of $30,000—which goes towards the operating costs of the FDA. By comparison, the MFi program only costs $99/year per company. So even if the $4 is heavily marked up from the manufacturing cost, any profit likely goes towards running the MFi program.

So we’ve covered why Apple made the Lightning connector and why the MFi program exists, but you might be thinking: Thanks for the history lesson, but what does this have to do with their transition to USB-C?

USB-C: the everything connector

The USB-C 1.0 specification was finalized in 2014. It borrowed many of the features of the Lightning specification: variable power delivery, optional data transfer with support for multiple complex protocols, and an optional DAC for 3.5mm audio adapters. Oh, and of course, it was the first USB connector type with a reversible design.

However, unlike the Lightning connector, which came in only two configurations (power or data+power), support for features depends on which flavor of the specification the manufacturer chooses to implement—and therefore USB-C cables and devices vary widely in capabilities.

Matrix of protocol support for various USB-C modes (source: Wikipedia)

This was by design—the USB-IF wanted USB-C to be backward compatible with previous USB types and specifications. But they also wanted to support advanced data transfer protocols like Thunderbolt.

Wait, what’s the USB-IF Again?

I’ve mentioned USB-IF a few times now, so it’s probably good to give a basic explanation of what exactly it is. The USB-IF (USB Implementers Forum) is a non-profit consortium funded by tech companies—big and small—to design USB connectors and the specifications that enable them to power devices and transfer data. It was formed back in the mid-1990s to create a universal serial port design—at a time when nearly every mobile phone had a different charger, and ports on computers began to deviate from the already few standards they had.

Notably, the USB-IF board seats representatives from both Apple and Intel—who, as mentioned previously, designed the Thunderbolt specification together. So as you can imagine, both had a hand in designing the USB-C specification—with Apple seeming particularly involved with the project. This is important context to remember as we continue to explore Apple’s switch to USB-C.

The Power of USB-C

USB-C specification 2.1 supports DC power at up to 48V and 5A, equaling a massive 240W of power delivery. To put that in perspective, most laptop charging bricks are 65W, and most phones average around 30W.
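Those wattage figures fall straight out of the spec’s fixed voltage/current levels, since power is just volts times amps. A quick illustrative calculation—the profile list below is a simplified subset of USB Power Delivery levels, not an exhaustive one:

```python
# Power (watts) = voltage (volts) * current (amps) for common USB PD levels.
profiles = {
    "5 V / 3 A":  (5.0, 3.0),    # 15 W baseline
    "9 V / 3 A":  (9.0, 3.0),    # 27 W, common phone fast charging
    "20 V / 5 A": (20.0, 5.0),   # 100 W, the pre-EPR maximum
    "48 V / 5 A": (48.0, 5.0),   # 240 W, the PD 3.1 EPR maximum
}
for name, (volts, amps) in profiles.items():
    print(f"{name}: {volts * amps:.0f} W")
```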

The reason most devices don’t support the full 240W is twofold. First, the more power you want to deliver, the larger the AC-to-DC converter or “power brick” has to be in order to dissipate the heat caused by AC-to-DC conversion, so it doesn’t burn your house down. And second, the more DC current sent to your smart device, the greater the risk of damaging the battery, or of the device overheating and catching fire.

So how do you prevent that massive amount of power being sent to devices that could not thermally handle it?

Well, a simple solution would be to use a smaller charging brick that only outputs a low level of DC current. But as USB-C ports are now almost universally used, there’s nothing keeping you from plugging your iPhone into your 100W MacBook Pro charger. So there must be more to it…

USB-C, an Open Controlled Specification

The USB-C specification allows connectors to be either passive or active. Typically, power-only USB-C cables use passive connectors—so it’s up to the devices on either end to figure out how to communicate with one another. Just like the Lightning connector, active USB-C connectors pack a ton of silicon into their tiny size—especially the cables that support Thunderbolt and all the sub-protocols it carries, like HDMI and DisplayPort.

With that complexity, it’s imperative that active USB-C devices adhere to the specification—and just like Apple's MFi program, there are organizations that certify that manufacturers implement it correctly. Getting a product certified also allows the manufacturer to use the official USB logos on the product packaging.

Disassembled USB-C Thunderbolt cable (source: 9to5mac)

But most of this isn’t new to USB-C: all previous USB specifications also had some form of power delivery management and ways to determine what data the device on the other end could handle. The difference, though, is that nearly all previous USB connectors/cables were passive—meaning it was entirely up to the host device (for example, your computer) and the child device (your mouse) to figure out how to talk to one another.

This communication between charging bricks, cables, and devices allows the three to determine the optimal amount of DC current to send over the wire to your device, with the receiving device dictating the maximum current to take in. For the iPhone, both with Lightning and now with USB-C, Apple decided to limit that intake to 20W. Again, this is likely due to thermal constraints (not wanting to burn a hole in your pocket)—not because Apple can’t, or doesn’t want to, make a device that can charge faster at a higher wattage.
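Conceptually, that negotiation boils down to agreeing on the lowest limit among the three parties, with the receiving device’s cap winning out. Here’s a toy Python sketch of the idea—the function name and numbers are illustrative assumptions, not anything from the actual USB PD message protocol, which runs over the cable’s CC wire:

```python
# Toy model of three-way power negotiation: charger, cable, and device
# each advertise a limit, and the agreed power is capped by the weakest
# link -- with the receiving device having the final say on its intake.
def negotiate_watts(charger_max: float, cable_rating: float, device_limit: float) -> float:
    return min(charger_max, cable_rating, device_limit)

# A 100 W MacBook charger, a 100 W-rated cable, and an iPhone capped at 20 W:
print(negotiate_watts(charger_max=100, cable_rating=100, device_limit=20))  # 20
```

This is why plugging your iPhone into a 100W MacBook charger is safe: the phone simply refuses to draw more than its own limit.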

iPhones also have smaller batteries because of the efficiency of iOS and the Apple-designed silicon inside—so comparing the raw wattage number doesn’t necessarily indicate how quickly an iPhone can charge. And more importantly, charging at lower power levels extends battery life—which has been important to Apple, especially after all the fuss around battery degradation and difficulty replacing batteries in older models.

Lightning and USB-C: Same same, but Different

Lightning and USB-C are already seeming pretty identical at this point, but there are even more similarities we can draw:

There are "knockoff" USB-C accessories that either don't follow specification—or try to, but don't care enough to verify with a certification program everything is working correctly. For passive power-only USB-C cables that’s probably not going to be that big of an issue, but it’s still a good idea to check for certification logos on the packaging when purchasing USB-C accessories.

There is also the threat of malicious actors embedding malware into USB-C connectors, but with so many companies creating USB-C cables and devices, it's unlikely you’d stumble upon one. To mitigate this, in 2019 USB-IF introduced a USB-C Authentication Specification—taking another page directly out of Apple's MFi playbook. This allows a manufacturer to embed a certificate in their active USB-C accessories that host devices can challenge to verify their authenticity before connecting, or vice versa.
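The shape of such a challenge is easy to sketch. The real USB-C Authentication spec uses X.509 certificates and public-key signatures; the toy version below substitutes a shared-secret HMAC purely to show the handshake flow, so every name and key here is a made-up illustration:

```python
# Simplified challenge-response: the host sends a random nonce, and the
# accessory proves it holds a provisioned secret without revealing it.
# (The actual spec uses certificates and signatures, not a shared HMAC key.)
import hmac, hashlib, os

ACCESSORY_KEY = b"provisioned-at-certification"  # hypothetical embedded secret

def accessory_respond(challenge: bytes) -> bytes:
    return hmac.new(ACCESSORY_KEY, challenge, hashlib.sha256).digest()

def host_verify(challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(ACCESSORY_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(32)  # host picks a fresh nonce each time
print(host_verify(challenge, accessory_respond(challenge)))  # True: genuine
print(host_verify(challenge, b"\x00" * 32))                  # False: knockoff rejected
```

A fresh random challenge each time is what stops a knockoff from simply replaying a previously recorded valid response.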

So if the ports are so similar, and Apple was so involved in USB-C’s design, then…

…Why is the iPhone just now getting USB-C?

At the beginning, I explained I was frustrated because I believe the common complaints about Apple’s product choices exist only because people don’t have the full picture. And that isn’t anyone's fault—a two-paragraph news blurb just doesn’t contain enough information. And news organizations can generate more clicks when they paint any Apple decision as a money grab, or as Apple just wanting to protect its “walled garden”.

And in a way, it is true they want to protect their “walled garden”—but not out of malice. Hopefully I have demonstrated that without some level of protection and certification, both Lightning and USB-C connected devices are at risk of mismanaging power and overheating, could allow for malware, or might just flat out not work at all.

Apple and its partners pioneered all-in-one ports like Thunderbolt & Lightning—the engineering of which was no easy task. The restrictions built into these specifications were a necessity to build a safer, more robust, and versatile connector than anything offered by USB specifications at the time.

Apple didn't keep it all to themselves either—they took what they learned from Lightning and contributed significantly to the design of USB-C. When it was ready, they were one of the first to widely adopt USB-C across many of their devices (which, ironically, also got a lot of blowback at the time). With all that time and money invested, it was always the plan to switch the iPhone to USB-C.

During the launch event of the iPhone 5 in 2012, when Apple first revealed the Lightning connector, they declared it would be the standard on their devices for at least a decade. With the launch of the iPhone 15 today, having waited just over 10 years as promised, Apple transitioned their flagship device to USB-C.

And so in conclusion—

—hang on, you never talked about the EU legislation

Oh, right.

If you read Apple’s objections to the EU law in 2021, they weren’t against USB-C altogether—their concerns revolved around much of what we discussed here: the complexities of the USB-C specification that most consumers would know nothing about. They argued that without the bill requiring USB-C certification of accessories and clear product labels—consumers were at risk of purchasing potentially dangerous cables, or ones that don’t support data transfer, and then wondering why their phone isn’t connecting to their computer.

Apple also wanted to be able to sell their older models with Lightning connectors in the EU for those people who don’t care to pay a premium for the latest iPhone model. There are many sub-models of each iPhone generation depending on the region, mostly due to different cellular chip requirements. The legislation as it was drafted in 2021 would’ve resulted in Apple having to toss the previous generation EU-region iPhones in a landfill, or tear them apart to try and recover some of the recyclable materials—resulting in a lot of e-waste from perfectly good phones.

Furthermore, while that legislation was in the works for a while, it was only approved by the EU Parliament towards the end of 2022. Apple and its partner TSMC spend years developing their silicon chips. The A17 Pro—with the transistors to support 10 Gbps USB-C—was not pulled out of thin air in less than a year, or even two years.

Can we agree on one thing?

I hope this article has been informative. If you have any comments, or if I didn’t get the facts right on something, I’d love to hear about it below.

Regardless of whether I’ve convinced you on anything today—I hope we can at least agree that it’ll be nice for our backpacks to be just a little lighter, with one less cable to carry around.
