iPhone PKI handling flaws

The iPhone is obviously a consumer-market product that was later enhanced to become an enterprise device. Unfortunately, it seems Apple botched its corporate-oriented features, ending up with something that proves hard to integrate securely into a public-key infrastructure.

The following page summarizes our findings on chain-of-trust management on iPhones, describes a major security flaw, and explains how to cope with the current situation (January 2010).

iPhone provisioning protocols

iPhones currently support two provisioning protocols for installing certificates on a device: v2, released with iPhone OS 2.0, and v3, released with iPhone OS 3.0.

iPhone OS v2

This protocol is quite straightforward: place an XML configuration file named something.mobileconfig on a web server reachable by the iPhone, serve it with MIME type application/x-apple-aspen-config, point Safari at the corresponding URL, and let it download the file.
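For instance, with Apache the MIME-type mapping is a single directive (a sketch only; any configuration context, such as a .htaccess file, would do):

```apache
# Map the .mobileconfig extension to the MIME type Safari
# recognizes as an iPhone configuration profile.
AddType application/x-apple-aspen-config .mobileconfig
```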

XML configuration files are created with an Apple utility called the iPhone Configuration Utility (iPCU), a desktop program running on Windows or Mac OS X. Apple has not released specifications for the XML configuration files it produces.

iPhone OS v3

This protocol is an attempt by Apple to streamline over-the-air provisioning for large numbers of iPhones. It is described in Apple's Enterprise Deployment Guide.

Provisioning an iPhone in v3 is done through several network exchanges:

  1. iPhone accesses URL of provisioning server (hereafter: PS)
  2. PS responds with a minimal mobileconfig file requesting credentials
  3. iPhone POSTs a request to PS containing its signed credentials
  4. PS responds with key specifications and the address of a SCEP server
  5. iPhone performs SCEP request to SCEP server
  6. SCEP server delivers a certificate
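In step 2, the minimal mobileconfig is a "Profile Service" payload telling the device which attributes to sign and where to POST them in step 3. The fragment below is a sketch based on Apple's OTA enrollment documentation; the URL and challenge values are placeholders:

```xml
<dict>
    <key>PayloadType</key>
    <string>Profile Service</string>
    <key>PayloadContent</key>
    <dict>
        <!-- Where the device POSTs its signed attributes (step 3) -->
        <key>URL</key>
        <string>https://ps.example.com/profile</string>
        <!-- Attributes the device is asked to sign -->
        <key>DeviceAttributes</key>
        <array>
            <string>UDID</string>
            <string>IMEI</string>
            <string>VERSION</string>
        </array>
        <!-- Optional one-time token echoed back by the device -->
        <key>Challenge</key>
        <string>placeholder-one-time-token</string>
    </dict>
</dict>
```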


There are several shortcomings to that process:

Certificate fail

In step 3, the iPhone signs its own credentials (including its IMEI or device serial number) using an Apple-signed certificate. To validate this certificate, the chain of trust must be established up to Apple’s root CA. Unfortunately, Apple does not provide access to this chain except by jailbreaking an iPhone and extracting it directly.

The following chain of trust was manually extracted from a jailbroken iPhone:

Signed requests from this iPhone use a key whose certificate is
    issued by CN=Apple iPhone Device CA

The certificate for 'Apple iPhone Device CA' is:
    CN=Apple iPhone Device CA
    issued by CN=Apple iPhone Certification Authority

The certificate for 'Apple iPhone Certification Authority' is:
    CN=Apple iPhone Certification Authority
    issued by CN=Apple Root Certificate Authority

The certificate for 'Apple Root Certificate Authority' is:
    Serial Number: 1 (0x1)
    CN=Apple Root Certificate Authority
    issued by CN=Apple Root Certificate Authority

The last certificate in the chain is a self-signed root CA: the Apple Root Certificate Authority.

Interestingly, the Apple root CA at the top of the iPhone chain is not the same as the one published on the Apple web site. Fetching the root certificate published there shows:

    Serial Number: 2 (0x2)
    CN=Apple Root CA

Different name (CN), different serial number (1 vs. 2), but the same key ID. It looks like somebody reused the same key pair to generate a second certificate. It is hard to tell whether this is an oversight or intentional, but the fact remains: you cannot technically relate an iPhone signature to the Apple root CA certificate published on their web site. Even with the same key pair, verification fails because the Subject and Serial differ.
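The failure is easy to reproduce with openssl. Below, two self-signed "roots" are generated from one key pair, differing only in Subject and serial (mimicking the two Apple roots); a certificate chained to the first then fails to verify against the second. All names are invented for the demonstration:

```shell
# One key pair, two self-signed "root" certificates that differ only
# in Subject CN and serial number -- like the two Apple roots.
openssl genrsa -out root.key 2048
openssl req -new -x509 -key root.key -set_serial 1 -days 30 \
    -subj "/CN=Example Root Certificate Authority" -out root1.pem
openssl req -new -x509 -key root.key -set_serial 2 -days 30 \
    -subj "/CN=Example Root CA" -out root2.pem

# An intermediate certificate chained to root1.
openssl genrsa -out child.key 2048
openssl req -new -key child.key -subj "/CN=Example Intermediate CA" -out child.csr
openssl x509 -req -in child.csr -CA root1.pem -CAkey root.key \
    -set_serial 10 -days 30 -out child.pem

# Chain building matches on the issuer's Subject DN, so the shared
# key does not help: verification succeeds with root1 only.
openssl verify -CAfile root1.pem child.pem
openssl verify -CAfile root2.pem child.pem || echo "root2: verification fails"
```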

SCEP fail

It looks like the iPhone SCEP client implements an old (draft) version of the SCEP protocol. For example, sending back a chain of trust containing several certificates leads to an error: the iPhone accepts only one certificate in response to a CA-chain request. If you need to talk to a SCEP server, make sure it accepts old-style requests.
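Server-side, one workaround is to answer the GetCACert request with a degenerate PKCS#7 holding a single certificate instead of the full chain. A sketch with openssl, using a self-signed stand-in for the issuing CA:

```shell
# Self-signed stand-in for the issuing CA certificate.
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key \
    -subj "/CN=Example SCEP CA" -days 30 -out ca.pem

# GetCACert response body: a degenerate "certs-only" PKCS#7
# containing exactly one certificate.
openssl crl2pkcs7 -nocrl -certfile ca.pem -outform DER -out getcacert.der
```

The blob would then be served as the GetCACert response (with Content-Type application/x-x509-ca-cert for a single certificate, per the SCEP drafts).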

mobileconfig fail

As seen above, mobileconfig files can be installed over the air through the v2 or v3 protocol. It is also possible to connect the iPhone to a desktop running iPCU and transfer mobileconfig files over a cable.

An interesting difference is that profiles downloaded over the air are not trusted by default, whereas profiles transferred from iPCU over a cable are. This translates into a red icon for untrusted profiles and a happy green flag for trusted ones. As demonstrated below, trust does not actually depend on whether the medium is a cable or an over-the-air download.

A close study of iPCU revealed that:

  • iPCU generates its own key pair upon installation and self-signs its own certificate
  • Whenever a new iPhone is connected to that iPCU instance, iPCU inserts its own certificate into the iPhone's trusted keystore.
  • Further exchanges between this iPCU instance and a known iPhone are always trusted, as long as the iPCU certificate is present on the iPhone. The same holds for mobileconfig files sent over the air: as long as they are signed by a trusted iPCU, they are trusted upon download.

An even closer study of the certificate used by iPCU revealed that its key usage contains only Signature. This led us to discover the serious security flaw described below.

Security flaw

What was found

We observed that iPhones will trust mobileconfig files they receive over the air or through wire if they are signed by a trusted entity. However:

  • The keystore used to look up trusted CAs includes the default Safari keystore
  • A signature-only certificate is enough to sign mobileconfig files

There are 224 trusted root certificates in the iPhone keystore (v3.1); Apple publishes the complete list on its web site.

It is relatively easy to obtain a signature certificate from many of them without any sort of verification. A demo (test) signature certificate valid for sixty days can be obtained from Verisign with nothing more than a valid e-mail address (throwaway addresses work, too), at no cost and without providing any credit-card details.

NB: Verisign is not to blame for this in any way. Like most other certificate providers, they distribute unverified temporary certificates that you are not supposed to trust for anything.

What was tried

  • Create a throwaway e-mail address
  • Use it to request a demo certificate from Verisign Level 1 for a person named Apple Computer, valid for sixty days
  • Create a mobileconfig file with iPCU: name it Security Update and declare it as issued by Apple Computer. Export it to disk unsigned, as a plain XML file.
  • Using openssl smime and the P12 obtained from Verisign, sign the mobileconfig file (including the complete CA chain) and put it on a public HTTP server
  • Open the link from Safari on iPhone and observe that the configuration is trusted by the iPhone.
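The signing step can be sketched with openssl smime. A self-signed certificate stands in for the Verisign demo certificate here, and all file names are invented; with a real P12 you would first extract the key and certificate with openssl pkcs12 and embed the chain via -certfile:

```shell
# Stand-in for the demo signature certificate (self-signed here).
openssl req -x509 -newkey rsa:2048 -nodes -keyout signer.key \
    -subj "/CN=Apple Computer" -days 30 -out signer.pem

# Dummy stand-in for the unsigned profile exported from iPCU.
printf '<?xml version="1.0"?>\n<plist version="1.0"><dict/></plist>\n' \
    > unsigned.mobileconfig

# Sign: S/MIME, DER-encoded, opaque (-nodetach), signer cert embedded.
openssl smime -sign -in unsigned.mobileconfig -out signed.mobileconfig \
    -outform der -nodetach -signer signer.pem -inkey signer.key
```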

Edit 2010-02-04: demonstration file taken down. The point was made.

On an iPod Touch, the installation screen looks like this:

[Screenshot: installation screen for the downloaded mobileconfig file, shown as a trusted "Apple security update"]

To be successful, profile installation must be confirmed by the end-user. Unless they know about this flaw, a typical end-user is quite likely to trust an update that claims to be issued by Apple and that the device marks as trusted. A bit of social engineering is needed both to get the user to click the link and to accept the profile installation.

Exploiting the flaw

Parameters that can be set through a mobileconfig file on an iPhone include root certificates. Modifying root certificates makes it possible to act as a man-in-the-middle and hijack SSL (HTTPS) connections.
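Root certificates are injected through an ordinary certificate payload inside the profile. A hypothetical fragment (identifiers, display name, and the base64 blob are placeholders; com.apple.security.root is the payload type used for root certificates):

```xml
<dict>
    <key>PayloadType</key>
    <string>com.apple.security.root</string>
    <key>PayloadIdentifier</key>
    <string>com.example.update.rootcert</string>
    <key>PayloadUUID</key>
    <string>00000000-0000-0000-0000-000000000000</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
    <key>PayloadDisplayName</key>
    <string>Security Update</string>
    <!-- Base64 DER certificate added to the trusted roots -->
    <key>PayloadContent</key>
    <data>
    ...
    </data>
</dict>
```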

Obnoxious modifications can be made to the phone, such as prohibiting the use of Safari, Mail, and other apps, or adding extra VPN, Wi-Fi, or e-mail settings. It is also possible to mark the profile as non-removable by the end-user, which would force the iPhone's owner to wipe the device clean to remove it.

What could be done

There is absolutely no reason for an iPhone/iPod to trust the browser's root CAs for over-the-air mobileconfig downloads. Apple needs to define who should be able to download mobileconfig files onto a device, be it an end-user or a company, and devise a correct way to share keys between the device and its associated provisioning server.


Written by cryptopath

2010-January-29 at 13:41

22 Responses


  1. Interesting findings, and with Apple’s track record it’s not surprising to see their provisioning mechanism leaves a few things to be desired.

    I have looked into the implementation myself, not specifically investigating the security of it, but the details required to get it working for a customer scenario. Based on this I’d like to add my two cents 🙂

    I think I understand the reasoning behind iPCU = trusted while OTA = non-trusted. Connecting with USB is something the user is most likely aware that they are doing, while “anyone” could send an OTA profile. And in this sense the USB cable is probably deemed a “more secure” medium by default. It’s possibly not a good distinction, but I can live with it. The issue with trust is of course, as you point out, that “Verified” by Apple’s terms doesn’t really equal what others might consider “verified” (as in: comes from the source you believe it comes from).

    That being said, there are potentially different certificates in play in a company’s implementation of a provisioning server. I do not know why Apple allow provisioning over HTTP – most likely to make it easier to implement for those who don’t feel like configuring SSL on their web server. If SSL is required for this connection you’ll have an extra check in place here. Yes, I realize that the same way that you can issue fake signature certificates you can try to circumvent this barrier for SSL certs as well. And while the social engineering aspect is always present, Apple could at least try to include some information in a dialog box when connecting to the provisioning server to try and make it more or less likely that the profile is trustworthy. Trusted or not the user has to accept the profile, and it cannot be silently deployed.

    When I tested I generated unsigned profiles with iPCU and signed these with a certificate issued from my own PKI (I used the “IPSec Offline Request” template). This profile when provisioned, correctly identified itself as “Unverified” since my own root CA is not trusted on the device. However I was not able to apply this policy without the device going through the SCEP enrollment process even if I accepted it. (I do not know if this is due to the implementation I’m using, or if this is the proper design.) I haven’t tested with a verified profile – are these able to be applied without a CA/SCEP? While it doesn’t require advanced skills to install and configure a SCEP server it raises the bar for being hijacked by script kiddies.

    The SCEP protocol does not, (as far as I can tell), require you to use HTTPS for the enrollment, but you can force SSL on your SCEP server. Of course that doesn’t help as long as the client side really doesn’t care, and if I’m setting up a spoofed SCEP server it’s not like I’ll go through steps hardening on the server side. I based my studies on draft 20 of SCEP, but I think Microsoft based their newest implementation on 17 or 18, so it may very well be that Apple are a few versions behind too. (I do not know right off the bat what the differences are.)

    The SCEP protocol recommends using one-time passwords for enrollment that need to be provided to the user out-of-band. With a Microsoft CA you specifically need to disable this requirement with a registry key. Once again, this could also be fixed through social engineering where the user is informed that they need to go to a web site to pick up a code. The verification of the OTP is only done on the server though, and not communicated back to the device so I guess you could just fake this part too.

    When the signed profile has been installed on the device, it cannot be removed by the end-user on-device or through iPCU, so I believe Apple got that part of it right 🙂 OTA provisioning is, after all, meant for enterprises that need to enforce security policies, not end-users trying to make things easier for themselves. End-users should just stick with iPCU.

    I tested it with a Windows Server 2008 R2 CA, and no adjustments had to be made for allowing the requests as such. I don’t have the logs available at the moment so I cannot say for sure if the request was done with HTTP POST or GET, but then again I don’t see that making a difference 🙂 The certificate chain has three certificates, and this was no problem. (I didn’t debug the traffic to see exactly what the iPhone did to handle this.)

    Depending on your SCEP server the certificate request coming from the client should be signed with the public key of the SCEP server, so as to ensure the proper CA issues certs as indicated by the profile.

    Still, it comes down to the fact that users are stupid, for lack of a better word. If you are the type that will happily install browser pop-ups saying they’re security updates from Microsoft you may very well accept “security updates” on your iPhone coming from Apple. Yes, I concede that you have to be more tech-savvy to understand that this is a concern on mobile devices. On the desktop side of things people are starting to learn.

    So what can Apple do? The OMA DM protocol (which tries to solve the same base problem of provisioning policies) needs the user to accept the bootstrap message with a user pin (could be provided in a separate sms). The user then has to type in the first four digits of the thumbprint of the SSL certificate on the server. Not very user-friendly, but it works. It’s still possible for someone to send out spoofed OMA DM profiles too, though. The OMA DM protocol has other issues as well, so I can understand Apple not going down that route initially. (It would also have made it harder to be proprietary, which Apple usually has a penchant for being.)

    Apple should definitely reduce the number of trusted root CAs however. The Safari keystore is not a good keystore for profile signing certificates. Creating a scenario where no profiles can be verified by default is not a good scenario because that makes zero-touch provisioning/deployments harder. Perhaps they should remove the concept of Verified/Unverified altogether since verified currently does not have the value it should.

    Apple should also enforce SSL and certificate checking in other steps of the provisioning process.

    But the user side of things? I send out my provisioning links via SMS. Yeah, I can configure “Apple” as the sender id, but shouldn’t the user think twice before believing info in unsolicited SMS messages? And shouldn’t users think twice before accepting a profile they weren’t expecting? There’s always the security hole that users might do things they regret in hindsight.

    You could have companies wanting to provision their devices having to register with Apple, and have the bootstrap sent from an Apple server to achieve the “Verified” status. Would mean some extra hurdles though, and Apple are probably more interested in increasing the user base and enterprise deployments than investing in extra security at this point in time. (I’m not holding my breath for iPhone OS 4, but it will eventually arrive and I’m hoping they’re doing more on the enterprise side in this version.)

    Apple recycling the same keyid for two certificates? I certainly consider that a flaw, and it makes me wonder exactly how the boys in Cupertino are handling certificates. But maybe they have a really supergood reason for doing so for all I know…

    Well, ok, maybe my opinion was more than your average two cents, but it turned out longer than I thought. (If the value of it was more or less than two cents I leave for others to decide.)


    2010-February-2 at 21:14

    • Sorry Andreas, you got OMA DM wrong. Let me just clarify this.
      The bootstrap for OMA DM is typically sent over OMA CP which can require the user to enter a PIN. This PIN is specified by the server and has nothing to do with a thumbprint of a SSL certificate.

      What you mean is the creation of a trusted relationship as a Nokia proprietary extension to OMA DM on some E-Series phones. This is only needed for getting privileges to do advanced operations in their OMA DM tree. Again, this is Nokia specific.

      Security for OMA DM in general can be achieved by SSL and DM built-in HMAC authentication. The most secure way for the bootstrap via OMA CP would be to sign it not only by the user PIN, but also by the network PIN (IMSI).

      Michael Diener

      2010-February-8 at 15:12

      • You’re right Michael. As my wording stands, it is incorrect concerning bootstrapping OMA DM. It was however not the intention to make this seem like the de facto standard implementation. I attribute it to being slightly imprecise in my choice of words 🙂 The thumbprint is specific to enabling TARM, which as far as I know is only present on the E-series from Nokia. But of the major smartphone platforms out there Nokia is the only one “championing” OMA DM as the main device management mechanism. Android had OMA DM slated for the 2.0 release in a draft (if I remember the versioning correctly), but it was pulled out later in the feature vetting process. The iPhone has no support presently, and although I have heard rumours that it might be implemented I haven’t seen it surface yet. Windows Mobile supports OMA DM, and Microsoft themselves use it in System Center Mobile Device Manager, but the bootstrap process is always initiated client side so it works out a bit differently. And with the current state regarding all the unknowns of WM 7 who knows if OMA DM will be included in future releases. It doesn’t really add all that much functionality either compared to a native dm client. OMA CP is more widely supported, but I guess we can call that a different ball game for the sake of this discussion. The N-series from Nokia have in my opinion been intentionally crippled by Nokia with TARM not being included.

        The problem with Nokia devices is the fact that some of the more important OMA DM settings in a security perspective, like power-on-password and enabling encryption, can only be done when TARM is enabled. (I have not researched the details, so do correct me if I am inaccurate.)

        Using network pin on the bootstrap message is a good security measure, but the problem is that the IMSI isn’t always accessible enough for practical use if you’re not a mobile operator. I sure can’t tell the IMSI of my SIM card without looking at the piece of paper the SIM came glued onto (thrown away long time ago) or querying programmatically. Maybe there’s some *#xyz# combo to find it more easily – I don’t know.


        2010-February-9 at 23:06

      • Andreas, just some corrections though this discussion should be focused on the iPhone and not OMA DM. All major manufacturers release most of their phones with OMA DM clients as network operators want them to. The feature set of these varies quite a bit and yes, Nokia E-Series contains a lot of them. But also Windows Mobile or Sony Ericsson have rich clients. Btw, Windows Mobile does in fact have security-related features. Also it should be considered that some DM clients can be extended by 3rd party applications, like on Nokia Series 60 or UIQ. One more point is that there are 3rd party DM clients out there that add support for rich feature sets of OMA DM on a lot of mobile OSes, whether they come with a native client or not.
        So, to conclude, if people want to use security related features, there is a market for OMA DM out there. Maybe not for the iPhone, but for almost all other vendors.

        Michael Diener

        2010-February-10 at 09:47

  2. Andreas: thanks for taking the time to respond. I would just like to add a couple of things here.

    This mobileconfig flaw basically ruins the trust you can place in the provisioning protocol described for iPhone OS v2. The protocol described for iPhone OS v3 may be vulnerable to some of the same issues, though it makes use of SCEP to request root and personal certificates, which is not a bad choice and might solve the issue of inserting root CAs.

    You mention Microsoft’s implementation of SCEP in Windows Server 2008: you may be interested in having a look at the following “improvements”:

    First improvement: instead of having to deal with a one-time password for each SCEP request, make it a one-time-for-all password so everyone gets the same one and it never expires. This reduces any security in a number of workflows to ashes.

    Second improvement: you can now set whatever you want for SubjectName upon certificate renewal. Ouch.

    SCEP also does not handle anything related to key management (import/export), so Apple will need to find something for that.

    How should Apple handle OTA updates? It would be presumptuous to venture a solution in five minutes, but they may want to have a look at how Mobile Network Operators solved this chicken-and-egg issue with their SIM cards.


    2010-February-2 at 23:03

  3. Your opening paragraph appears to suggest that these flaws would be acceptable in a “consumer market product,” and only become unacceptable for “an enterprise device.” Surely you didn’t mean to suggest that?


    2010-February-2 at 23:37

    • Agreed: security flaws are unacceptable for everyone!

      This proof-of-concept is taking advantage of a security issue in the way iPhones handle Over-The-Air updates, which is most specifically an enterprise-oriented feature (as opposed to consumer market). While we all certainly appreciate security in the consumer market, it is clear that IT departments in charge of handling large numbers of corporate smartphones have a lot of work to do to maintain an acceptable security level on their network. It is up to smartphone vendors to offer companies the necessary tools.


      2010-February-2 at 23:53

  4. I haven’t tested the iPhone v2 implementation, but SCEP in iPhone v3 obviously does not solve anything if the flaw lies in the design of the provisioning feature itself. (And as I pointed out all the hardening in the world server side isn’t going to fix the client side.)

    I am aware of the so-called “improvements” from Microsoft, and have not applied these to my CA. However as the iPhone happily ignores OTP entirely I guess it doesn’t matter. (Yes, I had to make the active choice of disabling OTP so I’m not without blame myself. It’s in a lab though, so I can live with it at the moment.)

    SCEP was an interesting choice from Apple, but I’m sure it’s not those kinds of scenarios Cisco had in mind when writing the first draft.

    Microsoft seem to be down-playing SCEP support on their side, and instead focusing on the new web service interface in 2008 R2. This is purely an interpretation on my part, and I don’t know what Microsoft officially would recommend. The main drawback obviously being that it’s Microsoft-specific, and will also require upgrading existing Microsoft PKI infrastructures. Certificates seem to be very important in most server products from MSFT currently though (Exchange, OCS, etc) so maybe they’ll set a “new gold standard” with WM 7 (whenever that is released for enterprise use). Unfortunately the resigned sarcasm in that last sentence was intentional…

    I know it’s not mandatory to use MSFT’s SCEP implementation, but the Apple weaknesses would be just as present with other implementations at any rate.

    I can see solutions like having a trusted root CA certificate embedded in the SIM card, and policies having to chain off the same root. It would however be tightly integrated with the operator in that way.

    I can also see something like provisioning having to be user-initiated client side and doing something like the ActiveSync wizard where your email address directs you to the correct server through autodiscover and authenticate with AD credentials or something.

    But admittedly these solutions aren’t perfect either. And there’s no way around it – Apple needs to actually think a little harder about security.


    2010-February-2 at 23:51

    • Excellent way to put it, thanks! There is no obvious solution, but that does not prevent us from working harder on the topic.


      2010-February-3 at 00:04

  6. Where are you able to set the “Web proxy name and port” in an iPCU profile? I don’t see this option in the utility. As far as I know you cannot force all network traffic from an iPhone through a central proxy. Please clarify this step in your process.


    2010-February-3 at 02:35

    • Last entry in iPCU: Configuration Profiles > Advanced > Proxy Server and Port.

      It probably has a different meaning on an iPod touch and an iPhone, I have not tried.


      2010-February-3 at 10:39

      • That is a proxy server for the custom APN, not a generic device proxy, no? Have you tested this?


        2010-February-3 at 13:31

  8. Does the 3.1.3 update from yesterday target this ‘problem’ in any way?


    2010-February-3 at 16:18

    • Does not look like it, no.


      2010-February-3 at 16:47

      • It’s not in the release notes either. Thanks.


        2010-February-3 at 16:49

  9. This is such an important contribution, especially now that the iPad uses the same firmware as the iPhone. Think how many iPad attacks could be carried out using this technique! Maybe this vulnerability will delay the iPad's release?

    Anyway, thank you for sharing your work in such a detailed post!

    Marco Ramilli

    2010-February-3 at 20:11

