One of the many new features Apple is rolling out with iOS 11 is the ability for third-party apps to make use of depth data gathered by the dual cameras on the iPhone 7 Plus (and, presumably, on the upcoming iPhone 8 and iPhone X). Anamorphic, a new iOS app from visual effects software developer BrainFeverMedia, is one of the first to take advantage of this feature. The app is currently in beta (along with iOS 11 itself), and Digital Trends has been testing it. Beyond offering insight into the magic of how Portrait Mode works, we discovered in our Anamorphic app review that it opens new creative doors for iPhone photographers.

The iPhone 7 Plus is the first iPhone to offer two camera modules: a standard wide-angle plus a telephoto lens. In addition to two unique angles of view, the iPhone 7 Plus introduced users to Portrait Mode, which uses computational photography to create a faux shallow depth-of-field effect, where the subject is in focus and the background is blurry. Portrait Mode looks at the differences between the two images captured by the cameras and uses that information to determine depth within the photograph, much in the way your two eyes help you perceive depth in the real world. Essentially, with some AI assistance, the iPhone can tell which objects are in the foreground and which are in the background. A selective blur can then be applied to areas of the frame, and the amount of blur can even increase with distance for a more realistic effect. Combined with facial recognition, the mode is especially useful for portraits, hence the name.
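To make the idea concrete, here is a minimal sketch of depth-dependent selective blur. This is not Apple's actual algorithm; the function name, toy image, and depth map are ours, and a real pipeline would use a proper lens-blur kernel rather than the simple box blur shown here. The point is just that pixels farther in depth from the subject plane receive a larger blur radius, while the subject stays sharp.

```python
import numpy as np

def portrait_blur(image, depth, subject_depth, max_radius=4):
    """Blur each pixel by an amount proportional to how far its
    depth is from the in-focus subject plane (simple box blur)."""
    h, w = image.shape
    out = np.empty((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            # Blur radius grows with |depth - subject_depth|
            r = int(round(max_radius * abs(depth[y, x] - subject_depth)))
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = image[y0:y1, x0:x1].mean()
    return out

# Toy scene: a bright vertical strip as the subject (depth 0.0)
# against a dark background (depth 1.0).
image = np.zeros((9, 9))
image[:, 4] = 1.0
depth = np.ones((9, 9))
depth[:, 3:6] = 0.0  # the subject occupies the middle columns

result = portrait_blur(image, depth, subject_depth=0.0)
```

Because the subject pixels sit exactly on the focal plane, their blur radius is zero and they are copied through unchanged, while background pixels are averaged over a wide window, which is the same foreground-sharp, background-soft look Portrait Mode produces.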