Apple managed to pull off some real surprises with the iPhone 14 Pro. There had been plenty of credible rumors that the company would move the front camera and Face ID system from the familiar notch to a pill-shaped cutout, but the brand-new “Dynamic Island” alert system appeared out of nowhere. And while it was increasingly apparent that Apple would eventually have to follow the industry toward larger camera sensors, the company went even further and rebooted its entire computational photography system as the Photonic Engine.
There’s a lot of that sort of thing on the iPhone 14 Pro, which still starts at $999 in the United States and goes up from there. Apple was among the last companies to ship a phone with an always-on display, but its version is far more colorful than anyone else’s. Apple is going all in on eSIM in the US, which no other company is really doing. There’s a basic satellite connectivity system that isn’t quite like anything we’ve seen before, and Apple plans to ship millions of these phones before the service even goes live later this year. All in all, the new iPhone 14 Pro contains more seedlings of potentially game-changing ideas than any iPhone we’ve seen in a very long time, and that’s saying a lot.
That’s probably the simplest way to think about the iPhone 14 Pro: it feels like the first step toward a lot of new things for Apple and the iPhone, maybe even the first glimpse of an entirely new kind of iPhone. That doesn’t mean any of these things have arrived in their final, ideal form, though.
I will be the first to admit that the name “Dynamic Island” has grown on me. After all, it has everyone talking, which is not something that typically happens to a smartphone’s status indicator system. If Apple wants to force everyone to take new ideas for smartphone interfaces seriously, I am all for it.
The island replaces Apple’s infamous and much-criticized notch, and it houses both the front camera and the Face ID system, which still need some room on the front of the display. But here’s the trick with the notch: after a few minutes of use, it all but disappears from your awareness.
The island, by contrast, is meant to be noticed. It sits lower on the screen than the notch did, and it’s a high-contrast interface element: a black pill in the middle of a white screen when the phone is in light mode. You’re going to see it, especially since it’s animated and constantly moving around. In dark mode, it blends in more naturally; I’d go so far as to say this is the first iPhone that feels like it works better in dark mode because of it.
Why, then, would Apple turn the discreet notch into a more noticeable island? Over the years, iOS has accumulated a number of different status indicator systems. An overlay appears when you plug in a charger or flip the mute switch. A green pill shows up in the corner if you’re on a call in the background, and a blue pill if an app is accessing your location. Icons for screen recording and personal hotspot use sit on the other side of the pill. Connecting your AirPods adds yet another layer. And some things, like timers and music playing in the background, have never really had meaningful status indicators at all.
The island is Apple’s new home for system alerts, replacing and unifying all of those older status systems, and it’s built to work with things like music playback and the new Live Activities API coming later this year in iOS 16, which will let apps surface even more background information, like your flight status or a sports score. It is not a replacement for notifications; those will still show up in the notification center, essentially the same as before.
At its most basic, the island is essentially a new widget system built on that Live Activities API. Each widget has three views: the main view in the island, an expanded view, and an ultra-minimal icon for when more than one thing is happening at once. If you have more than two activities going at the same time, Apple has an internal priority list that determines which two are the most important.
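Apple hasn’t published how that internal priority list works, but the “pick the top two activities” idea is easy to sketch. Everything below is purely illustrative: the `Activity` type, the names, and the priority values are hypothetical stand-ins, not Apple’s API.

```python
from dataclasses import dataclass


@dataclass
class Activity:
    name: str
    priority: int  # higher number = more important (hypothetical scale)


def island_slots(active: list[Activity]) -> list[Activity]:
    """Return at most two activities to show: the main pill plus the minimal icon."""
    return sorted(active, key=lambda a: a.priority, reverse=True)[:2]


# Three things running at once, but the island only has room for two.
running = [
    Activity("Now Playing", 10),
    Activity("Timer", 30),
    Activity("Personal Hotspot", 5),
]
print([a.name for a in island_slots(running)])  # ['Timer', 'Now Playing']
```

In this sketch the hotspot indicator simply doesn’t get a slot until one of the higher-priority activities ends, which matches the observable behavior that only two activities appear at a time.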
It’s a very cool idea, but as with all first versions of anything, some of Apple’s decisions turned out to be excellent, while others… well, it’s a first version.
The island is meant to feel more like hardware than software, almost like a secondary display that can grow and shrink. That’s a big decision, and it works, thanks to some classic Apple overengineering: the company developed a dynamic subpixel antialiasing system to make the island feel more natural, rendering its edges up to three times crisper than any other animation in iOS, which antialiases at the pixel level. In normal indoor lighting it works incredibly well; it really does look like the display cutout is growing and shrinking, and the animations are a lot of fun. (In sunlight or other bright light, the camera sensors become visible and the illusion falls apart, but the effect is still impressive.)
The second big thing that works: moving all of those various status indicators to the island and making them meaningful enough to be worth paying attention to is genuinely a great idea. Having call information right on the screen is convenient. Having your timers in such a handy spot is very useful. And implementing things like AirDrop and Face ID in a consistent way, in the same place, makes them easier to understand, which is a win.
Here’s where I think Apple fell a little short: in the keynote and all the ads, the island is shown as something worth interacting with, always moving around and flipping between the main view and the expanded view. In reality, it’s nothing like that.
The island is not a primary interface element; it sits on top of whatever app you’re using, and apps are still the main focus of the iPhone. Tapping the island doesn’t actually open the expanded widget view; it sends you back to the app controlling the widget. You have to tap and hold to get the bigger widget shown in all the ads. That seems backwards to me: I think a tap should pop open the widget, and at the very least you should be able to choose between the two behaviors.
That’s the central tension of the island: it’s far more noticeable and useful than the notch, but it’s background information you’re not really meant to engage with. Your music is playing, your personal hotspot is on, you’ve plugged in a charger; none of that actually demands your attention right now. I’ve been asked quite a few times whether fingerprints will be a problem for the front camera, and while it doesn’t seem to be, it matters even less because you barely ever touch the thing.
The Dynamic Island is one of those things that needs a year of development and developer attention before we can really judge how significant it is.
Still, you end up glancing at the island constantly, since the animations draw your eye far more than the notch ever did. And because it sits lower on the display, it can cover up content in apps that haven’t been updated. So right now, the tradeoff between how noticeable the island is and how useful it is feels slightly off; the island doesn’t quite do enough to justify being in the way this much.
Having said all of that, the tradeoff may change completely when the Live Activities API arrives later this year. Apple built the hooks to open this whole system to third-party developers, which is the other big thing it got right, and some of the ideas we’ve heard from Lyft, Flighty, and others are really intriguing. For now, though, the Dynamic Island remains one of those things that needs a year of development and developer attention before we can really judge how significant it is.
The most notable improvement to the camera system is the new 48-megapixel main sensor. Apple is a few years behind the curve here; Samsung has been using 108-megapixel sensors since the S20 Ultra in 2020, and Google put a 50-megapixel sensor in the Pixel 6 Pro just last year. Apple also updated the ultrawide and 3x telephoto cameras, though both stay at the usual 12 megapixels.
The basic principle is the same everywhere: to capture better images, you need to collect as much light as you can, and that means bigger pixels. But at some point it gets hard to make the pixels physically larger, so instead you put a lot more of them on a huge sensor and use software to aggregate them into giant virtual pixels. The practice is known as “pixel binning,” and the math behind Apple’s binning is simple: it combines four physical pixels into a single virtual pixel, which means the 14 Pro’s 48-megapixel sensor normally snaps 12-megapixel images.
Images captured by the Pro’s 48-megapixel sensor are typically 12 megapixels in resolution.
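The four-into-one math is easy to see in code. This is a deliberately simplified sketch: it averages plain 2×2 blocks of a grayscale array, whereas a real binning pipeline also has to account for the sensor’s color filter array, so this is an illustration of the arithmetic, not Apple’s actual processing.

```python
import numpy as np


def pixel_bin(sensor: np.ndarray, factor: int = 2) -> np.ndarray:
    """Average each factor-by-factor block of physical pixels into one virtual pixel."""
    h, w = sensor.shape
    assert h % factor == 0 and w % factor == 0
    # Reshape so every factor x factor block sits on its own pair of axes,
    # then average over those axes to collapse each block to one value.
    blocks = sensor.reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))


# Toy 4x4 "sensor" for illustration; a 48 MP sensor would be roughly 8000 x 6000.
sensor = np.arange(16, dtype=float).reshape(4, 4)
print(pixel_bin(sensor))
# [[ 2.5  4.5]
#  [10.5 12.5]]
```

Binning by a factor of 2 in each dimension quarters the pixel count, which is exactly the 48-megapixel-to-12-megapixel reduction described above, and each virtual pixel aggregates the light collected by four photosites.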
The other major change to the camera system is that Apple is now running its Deep Fusion processing for mid- and low-light photos earlier in the pipeline, on uncompressed image data. This is supposed to improve low-light performance by two to three times, depending on which camera you’re shooting with, and it’s the change that prompted Apple to rename the entire image processing pipeline the “Photonic Engine.” Apple still does Smart HDR and all of its other familiar processing; the new name just sounds more impressive.
We have always called Deep Fusion “sweater mode,” because Apple likes to demo it with moody photos of people wearing sweaters in dim settings, but its results have always been very subtle. That’s still the case on the iPhone 14 Pro: sweater mode appears to keep quietly doing its thing, now on uncompressed data.
The 14 Pro and the 13 Pro take extremely similar photos. You really have to pinch in to 100 percent to notice any difference in temperature or captured detail. That’s true for both the main camera and the ultrawide, which benefits from the Photonic Engine in addition to its larger sensor. In very low light, if you look closely at 100 percent, the ultrawide’s details look slightly better than the 13 Pro’s.
It’s the same story in bright light: these photos of Mariya Abdulkaf, a senior video producer at The Verge, taken outside look pretty much the same. Zoom in to 100 percent, though, and you can see the iPhone 14 Pro capturing a little more detail, with nicer background blur thanks to the significantly larger sensor. It’s very good, but on Instagram the difference isn’t really visible. The Pixel 6 Pro’s pixel-binned 50-megapixel sensor, for its part, captures noticeably more detail and a wider range of color.
This is about how far the Pixel and the iPhone have diverged over the past few years. Both capture a lot of detail and perform exceptionally well in low light, but the Pixel 6 Pro makes very different choices about highlights and shadows, while the iPhone 14 Pro is much more willing to let highlights blow out and even to let some vignetting creep in. I honestly don’t know which approach counts as “better.” Both of these night mode shots are excellent; which one you prefer comes down entirely to taste.
Where the iPhone 14 Pro really struggles against its competitors is in the details of the processing: Apple has been steadily ramping up its noise reduction and sharpening over the past few years, and the 14 Pro has Apple’s most aggressive noise reduction and sharpening yet. Sometimes it just looks bad: compared to the Pixel, this night skyline shot comes out an overprocessed, jumbled mess.
Against the Samsung S22 Ultra, the iPhone is less consistent. The S22 Ultra retains more color detail even in low light, and its sharpening and noise reduction are less aggressive than the iPhone’s. In bright light, the differences between the 14 Pro and the S22 Ultra narrow, but Samsung still does a better job with detail. In typical Samsung fashion, the colors are far more vibrant and warm than the iPhone’s more natural look; Samsung’s ideas about color are occasionally from a whole different planet. Apples to apples, though, the S22 Ultra produces more consistent results with better fine detail.
A big sensor with a lot of pixels opens up other possibilities beyond binning: Apple is also cropping the sensor to achieve what it calls “optical quality” 2x zoom, using just the central 12 megapixels of the 48-megapixel sensor. (Shoot in ProRAW at the full 48 megapixels and crop out the center of the image, and you’ll end up with the same picture.) Hardware-wise, this is a big improvement over the 2x telephoto lens in the two-year-old iPhone 12 Pro, although since you don’t get the benefit of pixel binning, it can struggle in lower light. It’s nice to have, though, and it offers a useful middle ground between the standard wide and the 3x telephoto.
This 2x crop is also the default for portrait mode, which doesn’t appear to have improved much. The S22 Ultra and even the regular S22 take better portraits; Samsung has mastered separating a subject from the background down to individual strands of hair, and the 14 Pro can’t compete with that. In the brightest light possible, it went ahead and just cut off a portion of Mariya’s head.
I did end up taking a couple of ProRAW shots at 48 megapixels, and each has a tremendous amount of detail and plenty of room for editing. If you’re the kind of person who gets excited about ProRAW on an iPhone, the 14 Pro will keep you entertained indefinitely. But I don’t think regular people should be shooting 48-megapixel photos on their phones.
I compared several ordinary selfies against the 13 Pro, and the overall differences were so slight they were barely visible. Apple has added autofocus to the selfie camera, which is presumably useful in some situations.
Becca Farsace, a senior video producer at The Verge, tested the iPhone 14 Pro’s video features. She reports that they look just as great as ever but aren’t a big leap over the already excellent iPhone 13 Pro. You can see more in the video above, but here are the highlights:
Cinematic Mode was more than a little clumsy on the 13 Pro last year, but Apple keeps working on it, and on the 14 Pro, it does a better job of separating faces from the background before applying blur. It also now supports video at up to 4K. In Becca’s tests, it performed best on human faces and was less successful with other kinds of subjects.
Cinematic Mode now supports 4K resolution.
The other big new video feature this year is Action Mode, a stabilization system that lets you shoot smooth, steady footage without accessories like a gimbal. But it comes with some substantial limitations: it needs a lot of light to work, and the captured footage takes a significant crop (it maxes out at 2.8K resolution, not 4K). It’s fun to play with, but this is another feature that doesn’t feel truly practical for at least another year.
Becca says it’s very hard to tell 14 Pro footage from 13 Pro footage in optimal lighting, but in dimmer conditions, the 14 Pro’s telephoto produces a noticeably crisper image with less noise.
The iPhone has been the smartphone video camera to beat for years now, and the 14 Pro keeps it in that position. You should really watch the video for more; writing about video footage is a bit like dancing about architecture, you know?
Apple finally added an always-on display to the iPhone 14 Pro, years after always-on displays became standard on Android phones. That’s fine! To preserve battery life, the display drops its refresh rate all the way down to 1Hz and dims dramatically. Apple has done some nice work to keep wallpaper colors accurate in this low-power mode, but honestly, I’d rather have a Pixel-style black-and-white clock than something that sort of looks like my phone is awake all the time. I really hope some customization options show up here in the future.
Becca and I tested the iPhone 14 Pro Max, while Allison Johnson used the iPhone 14 Pro. Battery life lasted all day, but all three of us felt the battery drained a bit faster than we were used to. To be fair, we’ve spent the past week hammering these phones with photos and videos, but, you know, we test a lot of smartphones that way. Apple claims the 14 Pro and 14 Pro Max should get slightly better battery life than the 13 Pros, so the always-on display may be the culprit for the faster drain. Either way, it’s something we’ll be keeping a close eye on over time.
Beyond that, the display is a touch brighter than before. It can now hit a peak of 1,600 nits for HDR content, up from 1,200 nits, and 2,000 nits in direct sunlight. It also keeps the 120Hz ProMotion feature from the 13 Pro for smoother scrolling and interactions. I’ve long thought Apple’s phone screens are the best on the market, and this one is no exception.
Apple pulled another surprise by removing the SIM card tray from iPhones sold in the United States, which means it’s time for everyone to get used to eSIM, which connects to mobile networks without a physical SIM card. The 14 Pro can store at least eight eSIMs, which is impressive considering only two can be active at once. In my testing, it worked quite well: my AT&T account transferred from the physical SIM in my iPhone 13 Pro directly over Bluetooth, and I connected my Google Fi account from the web with just a few taps.
Now, transferring eSIMs from an iPhone to an Android phone is nowhere near as easy, and carriers are certain to play some weirdo lock-in games here, because they are carriers and playing weirdo lock-in games is just what they do. It could also create headaches for travelers used to buying local SIM cards in the countries they visit. And I was hoping Apple would let iPhone users scan for available networks and sign up right from the phone, like you can on an iPad, but that doesn’t appear to be an option — you need an eSIM activation kit, which typically means scanning a QR code.
But the fact that you can add new networks to your phone quickly and easily also means that, in theory, we can all put more pressure on the carriers to compete with each other, which is unquestionably a good thing.
Allison got a preview of Apple’s emergency satellite connectivity system on the Apple campus, and in the demo she saw, it appeared well designed and functional. The software first has you attempt an emergency call over cellular, and if that fails, it falls back to the satellite connection. The system walks you through a series of questions that help first responders understand your situation, and then the interface shows you where to aim the phone to connect with a satellite; you can even watch the satellite icon move across the screen to reflect the satellite’s actual position as it travels across the sky. In the demo, conducted under the most controlled of controlled conditions, messages went through in under 30 seconds even with some foliage between the phone and the satellite.
Satellite SOS is one of those features you’re either interested in or not. If you spend a lot of time outdoors, it could be appealing as a peace-of-mind thing. If you never leave cell coverage, you probably don’t need a satellite emergency communication system, because you’ll never use it. Either way, it’s likely to become a paid add-on at some point — Apple won’t say how much it will cost, but it’s free for the first two years on the iPhone 14. It also won’t be available right away; it’s coming in an update in November.
Apple has not disclosed the cost of continuing to use Satellite SOS after the initial two years of free service.
Crash Detection is another feature for dealing with dreadful situations. It appears to work much like a feature available on Google’s Pixel phones, using data from several of the phone’s sensors to determine whether you’ve been in a car accident. The iPhone 14, 14 Pro, and the newest Apple Watch models have a new accelerometer that helps enable the feature, so don’t expect it to come to older iPhones anytime soon. Unlike SOS via satellite, Crash Detection requires no input from you: if it detects a crash, it displays a prompt to either call emergency services or dismiss the alert, and if you don’t respond within 20 seconds, it calls for help automatically. We haven’t yet worked out how to intentionally wreck a car to test it, but we have some ideas. Assuming it works as it should, it’s nice to have even if you never use it, since it’s free and requires no setup.
That covers most of what’s important to know about the 14 Pro and the 14 Pro Max. There’s a bit more to discuss, but most of it doesn’t change the basic iPhone equation: the 14 Pro is the only model this year with Apple’s latest A16 Bionic chip, which is more powerful and has a faster GPU. It’s fast, sure, but the 13 Pro and the 12 Pro before it were also fast, so it’s hard to feel the difference. Apple’s performance advantage is still best expressed as longevity: these phones are so fast that they won’t feel slow for years to come.
The Pro comes in four colors, including the newly introduced purple and darker black options, both of which have gotten a lot of positive feedback. Then again, almost everyone puts their phone in a case anyway. In fact, the Max is so large that it’s nearly impossible to use without a case that offers some kind of grip.
The way I think about Apple’s current iPhone lineup is that the iPhone 13 Pro was the culmination of a lot of ideas for Apple: confident, complete, and kind of hard to criticize. The iPhone 14 is essentially a lightly updated iPhone 13, and in many ways it has that same feeling of ideas fully realized.
The iPhone 14 Pro, on the other hand, marks the unmistakable beginning of a lot of new ideas: the Dynamic Island, the new camera, that satellite communication system. New ideas, by their nature, are never fully formed, but these are worth criticizing, which is itself a win and proof that Apple isn’t standing still on the iPhone’s future. I think we’d all benefit from thinking harder about how our smartphones work, and features like the Dynamic Island are evidence that Apple is still thinking hard about parts of the iPhone experience.
The 14 Pro marks the unmistakable beginning of a lot of new ideas.
What I don’t know is whether any of these new ideas are worth it yet. If you’re the type of person who’s happy to accept some rough edges in order to be on the cutting edge, you’re going to have a lot of fun with the iPhone 14 Pro; in many ways, you’ll be figuring it out right alongside Apple, which is a pretty cool place to be. But if you’re content with the phone you have now, it might be worth waiting another year to see how some of these new ideas pan out.