iPhone 14 Pro vs Google Pixel 7 Pro – Cameras Compared

November 11, 2022

iPhone 14 Pro vs Google Pixel 7 Pro: Which smartphone is best for photography?

Apple is obviously a huge name in the smartphone market, but in recent years, the Google Pixel range has garnered a great reputation for photography. But which is better? In this head-to-head we’ll be aiming to find out.

Both phones are capable of taking excellent photos, and they’re two of the best smartphones for photographers you can buy.

The Pixel 7 Pro is Google’s most advanced model yet, offering a triple-lens setup that is in many ways similar to the iPhone 14 Pro’s.

iPhone 14 Pro vs Google Pixel 7 Pro: Camera specs

Both the iPhone 14 Pro and the Google Pixel 7 Pro have a triple lens setup – it’s the second time that the Pixel has included a third lens in its array, with the 6 Pro being the first Pixel with a triple lens setup.

That means you get a standard wide-angle (1x) lens for both models, joined by an ultra-wide-angle lens (0.5x) and a zoom lens (3x for the iPhone, 5x for the Pixel).

Behind the main lens for the iPhone is a 48 megapixel sensor, while the Pixel has a slightly higher resolution sensor at 50 megapixels.

The ultra-wide lenses on both models are paired with 12 megapixel sensors, with auto-focus (AF).

For the zoom lenses, the Pixel pairs a 48 megapixel sensor with a 5x telephoto reach, while the iPhone makes do with a 12 megapixel sensor and a 3x telephoto reach.

iPhone 14 Pro vs Google Pixel 7 Pro, photo: Amy Davies

In terms of other lens specifications, the iPhone’s wide-angle lens is f/1.78 and features optical image stabilisation (OIS), the ultra wide is f/2.2, while the zoom lens is f/2.8 and again has optical image stabilisation.

For the Pixel, the wide angle lens is f/1.85 and it has both optical image stabilisation and electronic image stabilisation (EIS). The ultra wide is f/2.2, and the zoom lens is f/3.5 and again has both OIS and EIS.

Other camera-related specifications include 4K video at up to 60fps, slow-mo shooting and shallow depth-of-field video effects – across both models.

iPhone 14 Pro vs Google Pixel 7 Pro: Camera apps and shooting modes

Both the iPhone and Pixel have relatively fuss-free native camera apps. On the one hand, that’s good news if you want to concentrate on composition, but the simplicity is frustrating for advanced photographers who want to change settings. Neither phone offers an “advanced” mode, though you can make changes to certain settings, and both models can shoot in raw format.

Low-light shooting is available through the Pixel’s selectable “Night Sight” mode, while the iPhone’s Night mode only appears when the phone detects that the light is low. The Pixel also has an Astrophotography mode, which can be used with a tripod or other steady support.

As is pretty common for most smartphones, both have a Portrait mode which means you can create shallow depth-of-field effects with a range of subjects – not just humans.

iPhone 14 Pro vs Google Pixel 7 Pro, photo: Amy Davies

New for the Pixel is a macro mode, which automatically activates if the phone detects that it is close to a subject (you can switch it off at this point if you prefer). The iPhone 14 Pro is the second iPhone with a macro mode, following on from the 13 Pro. It works in the same way as the Pixel’s – automatically activating in the right conditions, using the ultra-wide camera and cropping into the frame.

Both phones include video recording modes, with the iPhone having a separate “Cinematic” mode for creating shallow depth of field effects. The Pixel has a similar mode, which is also called “Cinematic”.

One mode that the Pixel has that the iPhone doesn’t is “Motion”, which allows you to create long exposures and panning shots directly in camera.

iPhone 14 Pro vs Google Pixel 7 Pro: General image quality

Comparing “general” images from both phones, taken in good light and across a variety of subjects, reveals that image quality between the two is very similar. Looking at images on their phone screens gives the best view, but even looking at them on a computer screen shows that the two are very well matched.

iPhone 14 Pro – General flower photo

Google Pixel 7 Pro – General flower photo

If you’re someone who takes most of their smartphone photos in these conditions – i.e. day-to-day shots – then the Pixel 7 Pro certainly offers better value for money for more or less the same output. If we’re being really picky, the 7 Pro seems to produce images with a slightly stronger “HDR”-type effect, which isn’t always attractive, depending on your personal preferences. The Pixel 7 Pro also seems to produce slightly warmer results – which look attractive – but the iPhone’s shots are truer to the scene.

iPhone 14 Pro general photo, using the standard (wide) camera on a grey day

Pixel 7 Pro general photo, using the standard (wide) camera on a grey day

With the Pixel 7 Pro, you get a more flexible zoom, with up to 5x optical zoom available. Images taken with the Pixel 7 Pro’s 5x zoom (117mm equivalent) roughly match the quality of the iPhone 14 Pro’s 3x offering (77mm), while delivering considerably more reach. If you’re keen to have more zoom power, then, the Pixel 7 Pro is probably the better option.

iPhone 14 Pro, 3x telephoto zoom

Pixel 7 Pro, 5x telephoto shot, taken from the same spot

iPhone 14 Pro vs Google Pixel 7 Pro: Low light

Both of these smartphones have a dedicated low-light / night mode. We tested both with a very low-light scene, to really push the limits of each device. Both put in a good performance, doing particularly well when using the main / 1x lens.

iPhone 14 Pro – Night mode, 0.5x ultra-wide-angle camera

Pixel 7 Pro – Night mode, 0.5x ultra-wide-angle camera

If you switch to the ultra-wide lens, details become a little fuzzier, but they both put in similar performances. There’s perhaps slightly more detail visible in the iPhone shot, but there’s not a lot in it. The Pixel 7 Pro is also more susceptible to lens flare.

iPhone 14 Pro – Night mode, 1x camera

Pixel 7 Pro – Night mode, 1x camera

The “2x” option on both devices actually uses a crop of the main sensor. Both put in a decent performance at this focal length, with the Pixel perhaps having a touch more detail.

Again, when light is very low, the telephoto night mode uses a crop of the main sensor on both phones. The results at 3x (iPhone) or 5x (Pixel) are not particularly pleasing, so it’s something you’d probably only want to use sparingly. When light is a little better (but still low enough to use Night mode), you do get better results.

It’s worth noting that the Pixel 7 Pro (and other Pixel phones in the series) offers an Astrophotography mode designed to help you capture stars, with particularly long exposures possible when using the phone on a tripod or other stable surface.

Overall, the two are pretty evenly matched here, and it’s hard to recommend one wholly over the other.

iPhone 14 Pro vs Google Pixel 7 Pro: Macro

The macro modes for both phones automatically activate when you bring the device close to a subject.

iPhone 14 Pro – Macro shot

Pixel 7 Pro – Macro shot

Again, both phones put in a decent performance here, with not a lot to separate the two. Examining very closely, there’s perhaps a little more detail in the iPhone shot, but the Pixel’s shot is a little warmer. As this is a macro test, we’d probably prefer detail over warmth, but both are great for grabbing close-up shots.

iPhone 14 Pro vs Google Pixel 7 Pro: Portrait

Shallow depth-of-field effects have been a common staple among smartphones for quite a while now.

iPhone 14 Pro, portrait mode, 1x

Both models put in a good performance here, creating decent separation from the background. The iPhone lets you take portraits with a wider view, whilst the Pixel 7 Pro starts with a more cropped image, matching the 2x option on the iPhone. Results are similar, although the Pixel 7 Pro seems to have the edge slightly, particularly in detail on the subject’s face.

Pixel 7 Pro portrait with 1x selected

iPhone 14 Pro portrait, 2x selected

However, the iPhone’s result is much more attractive when using the 3x option, as it uses the telephoto camera. This appears to be roughly equivalent to the Pixel’s 2x option in terms of cropping, but not in quality, as the Pixel 7 Pro relies on digital zoom here.

Pixel 7 Pro portrait, 2x selected (this uses digital cropping)

iPhone 14 Pro – portrait, 3x selected (using the telephoto camera)

With the iPhone, you also get the choice of 1x, 2x and 3x, making it the more flexible option for including more of the background, whereas the Pixel 7 Pro doesn’t let you use the 5x telephoto camera for portraits. Both work well with human and non-human subjects alike.

iPhone 14 Pro vs Google Pixel 7 Pro: Video

We’ve got fairly similar video specifications between the two models here. Both offer up to 4K 60fps shooting, with no sign of the 8K found on models like the Samsung Galaxy S22 Ultra. The iPhone gives you the option to also use “ProRes” video recording, which is available in 4K – but only if you have a 256GB or above model (you can shoot in ProRes at Full HD if you have a 128GB model).

There’s also Cinematic mode for both devices, which creates shallow depth of field effects in video. The iPhone’s looks more natural than the Pixel’s (which is easily confused), and it’s likely this will improve with future updates.

Both offer stabilised video options, which are useful if you’re trying to shoot video while moving – such as while jogging. Both also have slow-mo video, and other functions such as time-lapse.

iPhone 14 Pro vs Google Pixel 7 Pro: Screen and Design

With the iPhone you can opt for either the standard 14 Pro model (6.1-inch screen), or you can plump for the larger 14 Pro Max model (6.7-inch screen), which is the same size as the Pixel 7 Pro. The smaller of the two is better suited to those with smaller hands (or smaller pockets), and is also easier to use for day-to-day tasks like sending texts.

There’s no difference between the Pro and the Pro Max models when it comes to camera specifications.

iPhone 14 Pro vs Google Pixel 7 Pro, photo: Amy Davies

Meanwhile, the Pixel 7 Pro is only available in that one size (unless you go for the smaller Pixel 7, which lacks some of the features of the 7 Pro). That’s ideal if you like a bigger screen, but perhaps less good news if you want something more compact. The Pixel 7 Pro has a higher resolution (3120 x 1440) than even the larger iPhone 14 Pro Max (2796 x 1290) but, if you place the two side by side, it’s not immediately obvious – both look very good, with vibrant and bright displays.

Although the Pixel 7 Pro has the same size screen as the iPhone 14 Pro Max, the phone itself is slightly larger, although very slightly narrower. It measures 162.9 x 76.6 x 8.9mm, against 160.7 x 77.6 x 7.85mm for the iPhone 14 Pro Max; the standard iPhone 14 Pro measures 147.5 x 71.5 x 7.85mm.

The Pixel 7 Pro has a slightly sleeker look about it, with more rounded edges than the boxier appearance of the iPhone 14 Pro. Which you prefer is likely a matter of personal taste. Both phones are IP68 rated, which means that they can withstand dust and water. Both also have “tough” features like a Corning Gorilla Glass Victus Screen (Pixel 7 Pro) or a Ceramic Shield Screen (iPhone 14 Pro).

iPhone 14 Pro vs Google Pixel 7 Pro: Battery Life and Capacity

Annoyingly, Apple does not disclose official battery capacities, but we can gauge performance from its quoted battery life and from real-world testing. Google, meanwhile, says the Pixel 7 Pro has a 5,000mAh battery. Looking at the battery life quotes alone, the Pixel and the iPhone have pretty similar capabilities.

Apple says the iPhone 14 Pro is good for 23 hours of video playback, or 29 hours if you opt for the larger 14 Pro Max. Meanwhile, the Pixel 7 Pro’s battery life is quoted at 24 hours of video playback. In real-world usage, we’ve found that both phones last a full day of “normal” use, which includes not only taking pictures and video, but of course browsing the web, using apps and making calls. The Pixel 7 Pro also has an “Extreme Battery Saver” option, which can stretch battery life to up to 72 hours at the cost of certain features.

iPhone 14 Pro vs Google Pixel 7 Pro, photo: Amy Davies

Both models offer fast charging, and wireless charging with compatible chargers. Neither provides a charging plug in the box, but both come with USB cables.

Neither of these phones gives you the option to expand storage after purchase, so if you think you’re going to need a lot, you’ll need to pay for it up front. The base storage for both models is 128GB, rising to 512GB for the Pixel 7 Pro and 1TB for the iPhone 14 Pro. At the time of writing, the 512GB version of the Pixel 7 Pro doesn’t appear to be available to buy in the UK.

iPhone 14 Pro vs Google Pixel 7 Pro: Price

One of the best things about the Pixel 7 Pro is that it’s available at a reasonable price, especially for a flagship model. UK pricing is £849 for the 128GB model, rising to £949 for the 256GB version.

If we compare that with the iPhone 14 Pro Max, which is £1,149 for the 128GB, or £1,309 for the 256GB, there’s quite a price disparity – and you don’t seem to get too much more for your money (apart from the Apple name). The smaller iPhone 14 Pro is a little cheaper, starting at £1,099 for the 128GB version, and £1,209 for the 256GB version, but it’s still a hefty chunk of change.

For those looking for a class-leading smartphone, but without the massive budget to back it up, the Pixel 7 Pro is certainly better for your finances.

iPhone 14 Pro vs Google Pixel 7 Pro, photo: Amy Davies

iPhone 14 Pro vs Google Pixel 7 Pro: Verdict

There are plenty of excellent features on both smartphones, and both produce great pictures and video in a variety of situations.

If we had to choose one, the iPhone 14 Pro probably edges it, giving better Portrait results and slightly better night-time results. Then again, the Pixel has better zooming capabilities. It’s certainly true that the iPhone’s advantage isn’t big enough to warrant the extra outlay – especially if you’re on a tight budget.

Overall, the two are pretty evenly matched in almost every respect. If you prefer iOS, then you might want to consider one of the older Apple models to save cash, but if you’re not fixed to Apple, then the Pixel 7 Pro is the smart choice.

Read our Google Pixel 7 Pro review
Read our iPhone 14 Pro review


iPhone 14 Pro vs Samsung Galaxy S22 Ultra
iPhone 14 Pro vs iPhone 13 Pro Compared

For even more options have a look at our guide to the best smartphones for photography.


Canon RF 24-70mm f/2.8L IS USM lens review

We love a good 24-70mm lens, and the Canon RF 24-70mm f/2.8L IS USM is a shining example. Versatile, compact and usually quite light, a 24-70mm is probably the most essential piece of glass you’ll carry in your camera bag. Canon’s premium RF version of this standard zoom offers a fast maximum aperture of f/2.8. We think it’s one of the best lenses for astrophotography you can buy, and if you own a Canon mirrorless camera it’s a brilliant lens for almost all situations. While astro purists will want something a little wider and a little faster (maybe a 20mm at f/1.8 or even f/1.4), this well-rounded lens will perform most nighttime photo tasks perfectly.

While many 24-70mm lenses across manufacturers are of a similar standard in terms of image quality, the Canon RF 24-70mm f/2.8L IS USM has a slight edge over direct competitors because it’s a touch smaller. It’s slimmer than the Nikon equivalent, for example, if ever so slightly heavier. As for the cost? A quality 24-70mm lens with a maximum aperture of f/2.8 is always going to cost you more than $2,000, and the Canon currently weighs in at $2,300 at most retailers.

Frigid Sunday with 30-degree highs ahead of warmer Thanksgiving week

Mainly clear skies overnight along with increasing gusty winds will usher in even colder air for Sunday.

Highs may not even make it out of the 30s, with feels-like readings in the teens and 20s. Gusty winds may exceed 30 miles per hour, especially in the mountains. Wind and wind chill advisories are in effect for far western Maryland and parts of West Virginia overnight and early Sunday.

The Thanksgiving travel week will feature a temperature uptick into the 50s.

Forecast models for the DMV are trending dry for travel Monday, Tuesday, and Wednesday.

Showers are possible late Thursday for Thanksgiving and a soaking rain is possible Friday of next week.

The Leonid meteor shower will continue to be active until Dec. 2, and we are seeing its peak now.

The next best viewing will be from midnight until 5 a.m. Saturday. Look to the eastern sky. 

Experts are predicting anywhere from 20 to 50 meteors per hour, with the biggest burst from 1 a.m. to 3 a.m. The sky will be clear. Fingers crossed you see one! If you are into astrophotography, be sure to send any pics you get to Chime In.

Download the First Alert Weather app to stay up to date with the latest forecast.

Hybrid solar eclipse: What is it and how does it occur?

A hybrid solar eclipse is a very rare and strange astronomical event — and there’s one coming soon on April 20, 2023.

Talk to most eclipse-chasers and they’ll tell you that there are three types of solar eclipse. The first is a partial eclipse, the most common and the least impressive, because the moon merely blocks out part of the sun, sending a shadow – the penumbra – across a swathe of Earth. The second is an annular solar eclipse, where the moon blocks out the center of the sun but leaves a circle of sunlight visible from within a shadow called the antumbra; it’s often called a “ring of fire”. The third is a total solar eclipse, where the entirety of the sun’s disc is blocked by the moon, revealing the spectacular sight of the solar corona, which can be viewed with the naked eye from within the moon’s dark shadow, the umbra. A hybrid eclipse is a rarer fourth case that combines the last two: thanks to the curvature of the Earth, it appears annular along some parts of the eclipse path and total along others.

Stunning meteor over North Island leads to hunt for meteorite

Footage captured of a large meteor entering the earth’s atmosphere over the top of the North Island. Video / Supplied by Logan Carpenter

A stunning event lit up the early morning sky across the top of the North Island as a meteor crossed into the earth’s atmosphere, and astronomers are on the hunt for more sightings.

Witnesses reported a large meteor soaring over the North Island at 4.26am on Saturday; it was spotted from Kaikohe to Auckland, and its calculated trajectory suggests it broke up east of Dargaville.

One of five current witness accounts on Fireballs NZ said they first noticed the meteor when the paddocks in front of them lit up in a pulsing, light green hue.

“Initially I was facing away from the object (and) I turned around thinking it was a vehicle on the road that was behind me. I saw it falling from the sky in a northerly direction where it changed from green to orange-yellow.

“My relief milker who was 30 minutes north driving southwards also saw it and asked about it on her arrival. Another person on the farm also commented. I was unable to hear any sounds as I was on a motorbike. By far the biggest event I’ve seen in the night sky before. And I’ve spent a lot of hours following cows in the dark.”

Logan Carpenter captured a “fireball in the sky” on a security camera on top of his house and felt very fortunate the camera was facing the right way at the right time.

Based in Castor Bay, Auckland, the amateur astrophotographer was looking at the stars through his telescope at the time and didn’t notice the event or the footage until his wife checked the home security camera the next day.

“I just love all sorts of this thing and thought wow!”

Another witness reported hearing a sonic boom that sounded like an explosion lasting five to seven seconds.

Associate Professor of Geology at the University of Otago James Scott said a sonic boom results from the meteor travelling faster than the speed of sound.

He believes far more people saw the event but have not yet logged it with Fireballs Aotearoa. With more information, examiners can analyse the trajectory and hope to recover freshly fallen meteorites in New Zealand.

“The key thing is that this seems to be over land and not sea, and there may be a meteorite associated with it since it travelled for several seconds in the atmosphere and therefore got low.

“The colour of the fireball relates to the ionisation of elements, principally oxygen, in the meteor trail, due to the heat build-up as the rock travels through the atmosphere.

“The last part of the path was not luminescent because either all the material was burned up, or the meteorite got to a low elevation and slowed down so much that melting of the edge of the fireball ceased and the rock then entered ‘dark flight’.

“These are the most exciting because they can drop meteorites. New Zealand has nine so far, but it is estimated that three to four meteorites of more than 100g should be ‘dropped’ on our land mass each year.”

There are currently no Fireball cameras in the region. The Royal Astronomical Society of New Zealand (RASNZ) has just helped to sponsor the rollout of 20 Fireball cameras across the country.

Further public reports can be lodged at www.fireballs.nz.

How to create a 24-hour star trails image

Star trails are relatively simple to do yet produce very striking results, especially when centred around the north or south celestial poles.

These images capture the movement of Earth as it rotates on its axis, producing star trails made up of fragments of concentric rings.

If you are in the polar regions during the winter and have clear, dark skies for 24 hours, the trails will form complete circles – even the pole star Polaris, also known as the North Star, traces a small circle, because it is not perfectly aligned with the celestial pole.

However, you don’t need 24 hours of darkness to create a 24-hour star trails image.

The above image was captured from the UK, and is effectively a 24-hour star trails image.

In this guide, we’ll show you exactly how we did it.

For more advice, read our guide on how to use a DSLR camera or our beginner’s guide to astrophotography.

Star trails captured by Adam Jeffers, Cookstown, Northern Ireland, August 2020 Equipment: Nikon D800E DSLR, Nikon 28–80mm lens, static tripod

What is a 24-hour star trails image?

Most of us will never experience a polar winter, but it is still possible to create a 24-hour star trail photograph by merging together a stack of many images taken with the same setup, from exactly the same spot, on different nights throughout the year.

Although taking a star trail image on a single night is straightforward, replicating it exactly on multiple dates, then merging them together successfully, is the challenge here.

First, the camera location must be identical.

We placed marks on the ground to ensure the tripod was in exactly the same spot, but attaching the camera to a fixed structure would be even better.

Point at the north celestial pole to give your star trails that attractive circular focal point. Credit: Anthony Beavers.

Focus must also be the same, otherwise the width of the star trails will differ, and they won’t line up perfectly.

Circumpolar star trails can be created under moonlight, but doing so will affect the fainter stars, so try to ensure the Moon’s brightness – its phase – is consistent between sessions.

Also, when you blend your final stacked images, the overlapping regions will be brighter, so only stack what’s necessary to complete the full circle.

Pick your foreground wisely

A good foreground can really make a star trails image! Credit: James Billings

Another consideration is the foreground. It’s important to have something in the foreground for context and scale.

I had a large tree in mine, which grew during the year and then required an extra layer-masking step at the end to compensate for the different tree size.

You might want to opt for a non-organic foreground object!

Star Trails over 18th Century Loop Tower by Peter Brown, Guernsey, Channel Islands. Equipment: Canon 1100D camera on tripod.

Choose some potential imaging dates that are spread across the whole year and if you have a clear sky forecast, be ready on those nights (for help, read our guide to weather forecasting for astronomy).

I took images on seven nights during the year, but only used the photos taken in February, April, July and November.

During the winter months you will have many more hours of imaging available to you, but in the summer you need to grab every minute of darkness.

If thin cloud moves through your field of view, don’t worry; the stars will still shine through.

Star trails captured on a smartphone with NightCap by Iain Todd, Bristol, UK, 26 February 2022.

Issues such as aircraft lights can be omitted from the image stack by using a program called StarStaX, which can fill the gaps.

Before you attempt to merge the stacked images, adjust the brightness and colour balance to make them as similar to each other as possible.

You may also have to apply a lens distortion correction to successfully line up the images. But all this effort is well worth it for the end result.
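
If you’d like to see what the stacking step actually does, the core operation is simply a per-pixel ‘lighten’ blend across every frame. Below is a minimal Python sketch of that idea – it’s not a replacement for StarStaX (there’s no gap-filling here), and the frames folder name is our own assumption:

```python
# Minimal lighten-blend star-trail stack (a sketch, not a StarStaX substitute).
# Assumes a folder "frames/" of same-sized JPEGs shot from a fixed tripod.
from pathlib import Path

import numpy as np
from PIL import Image

stack = None
for path in sorted(Path("frames").glob("*.jpg")):
    frame = np.asarray(Image.open(path), dtype=np.uint8)
    if stack is None:
        stack = frame.copy()
    else:
        # "Lighten" blend: keep the brighter pixel from either image,
        # so each star's motion accumulates into a continuous trail.
        stack = np.maximum(stack, frame)

Image.fromarray(stack).save("star_trails.jpg", quality=95)
```

Using the maximum rather than the mean is the key design choice here: averaging would wash the faint, moving stars out of the frame, while the maximum preserves every bright pixel they ever touched.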

What you’ll need

  • A DSLR camera with a widefield lens and a high-power battery or mains power lead
  • A remote shutter release cable that locks in place
  • A sturdy tripod
  • A dew heater to prevent the lens from fogging up during long imaging sessions
  • Software for image stacking and processing, eg StarStaX for stacking, and Photoshop or GIMP for merging and processing the stacked images

Create a 24-hour star trails image, step-by-step

Photo editing against the algorithm

Pixel 7 Pro hazel on a table next to coffee

Dhruv Bhutani / Android Authority

Google Pixel phones have been praised and recognized for their camera prowess since the Pixel 2. Interestingly, it wasn’t the camera hardware that made them better. In fact, Google managed to beat most of the best camera phones year after year, all with average camera hardware. For example, it wasn’t until the Pixel 4 that Google started adding more than one camera to its Pixel devices. And the camera hardware didn’t really get much better until the Pixel 6 series.

What made Pixel devices so good at photography? We can thank Google’s algorithm and computational photography for such great results. In short, it was all about AI and post-processing. Google knows what generally makes an image good, and it enhances images intelligently. The trick is to improve exposure, balance highlights/shadows, increase contrast, boost colors, and so on.

Additionally, Google can recognize skies, faces, objects, pets, and many other objects. It can then enhance these sections without affecting the rest of the image. Then you have modes like Night Sight, Astrophotography, HDR, and more, which can take a series of shots and merge them together to create a single, better photo.
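
As a rough illustration of that merging idea, here is a deliberately simplified Python sketch of burst averaging, the basic building block behind multi-shot modes of this kind. A real pipeline also aligns frames and rejects motion, which we skip, and the file names below are hypothetical:

```python
# Toy burst merge: averaging N noisy frames cuts random noise by ~sqrt(N).
# Assumes a perfectly still camera and scene; real pipelines align frames
# and handle motion before merging.
import numpy as np
from PIL import Image

paths = [f"burst_{i}.jpg" for i in range(8)]  # hypothetical burst file names
frames = [np.asarray(Image.open(p), dtype=np.float32) for p in paths]

merged = np.mean(frames, axis=0)

# A mild global "enhancement" pass: lift exposure and add contrast,
# loosely mimicking the kind of tone tweaks described above.
merged = np.clip((merged * 1.1 - 128.0) * 1.05 + 128.0, 0, 255)

Image.fromarray(merged.astype(np.uint8)).save("merged.jpg")
```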

Google Pixel phones have been praised and recognized for their camera prowess since the Pixel 2.

Knowing most of it is thanks to editing, we’ve been wondering if it’s really all that good compared to someone who knows their way around photo editing. I accepted the challenge and went up against the Pixel 7 Pro to find out who edits photos better. Has machine managed to beat man in photo editing? Let’s find out together!

A little about the photo editor

Edgar Cervantes portrait by David Imel

Edgar Cervantes / Android Authority

Hi there. Edgar here! I am Head of Imaging and Photography at Android Authority and have been a professional photographer for over a decade. Most of my work revolves around product photography, with a strong focus on mobile technology. I’ve done photography for a series of publications, as well as a variety of brands in the commercial sector.

Needless to say, I have plenty of image post-processing experience and know my way around Photoshop, Lightroom, Affinity Photo, Capture One, and others.

Photographer vs Google Pixel 7 Pro: The rules

The whole idea of this challenge is that we want to showcase what a little bit of editing knowledge can do for the average consumer. As such, I won’t be going too crazy with editing, and we can consider most of this post-processing as developing photos instead. We’ll play a bit with simpler edits, like changing the exposure, contrast, colors, shadows, etc. I won’t be replacing large objects or doing anything fancy. I might spot-heal some unwanted distractions, but that’s a simple tool anyone can use. I’ll also try to limit cropping unless I feel it makes a significant difference.

The Pixel 7 Pro renders images in a split second, but I am not a machine, so I gave myself a 5-minute limit on editing time. And because we know most of you probably don’t have paid photo editing software, I did it all with Lightroom. You can get the mobile Lightroom version and use most features for free. If you want a completely free alternative, Snapseed is just as good.

Furthermore, I did not shoot any of these photos. These were captured by our writer C. Scott Brown, an amateur hobbyist photographer with a more casual perspective on photography. Simply put, he is an average camera phone user. He shot all images in both RAW and JPEG. I will manually edit the uncompressed, unaltered RAW photos, and the Google Pixel 7 Pro will handle the JPEG post-processing.

Photographer vs Google Pixel 7 Pro: Let’s compare!

Any camera, including the Pixel 7 Pro’s, will get its best results with ample lighting. The sun is a powerful light source, so let’s take a look at some daylight photos first to see what we’re working with.

Both of these cactus images seemed a little dull and slightly under-exposed, so I increased the exposure and contrast to give the image more depth. I also lowered the highlights and increased the shadows to give it a more balanced look. The colors needed a bit more oomph, so I went ahead and increased the vibrance to give it a more fun aesthetic. Because Lightroom allows for automatic sky selection, I was able to focus on reducing exposure and highlights on the sky, while deepening the shadows and making the temperature cooler to make the blue sky pop.

On the prickly pear fruit image, I also increased the sharpness and texture to enhance the detail a bit.

I was more playful with this roses shot, as I noticed it had plenty of colors to play around with. Also, while the bigger flower was the clear subject, it got mixed up with everything going on in the image. I fixed the exposure and increased the contrast to give the image more depth. Then I increased the vibrance and saturation to enhance the colors. When all was done, I decided to give more emphasis on the main flower by making everything else just a bit darker. I used a mixture of vignetting and the brush tool to do this. When I had selected all but the flower, I reduced the exposure a bit.

I went a bit lighter on this flower, as all I wanted was to make it pop a bit more. I made the temperature warmer and increased the color vibrance. After that, I made slight edits to the exposure and lowered the highlights.

This park photo is one of my favorites. As soon as I saw it, the image of what I wanted it to look like popped right into my head. The image was great, but the Pixel 7 Pro really didn’t make the best out of this photo. In fact, it all looks a bit dead to me, which is not what a park should look like in real life. It needs to feel alive, colorful, and warm. Something that takes you away from the dryness and coldness of the city. It had to be almost like a cartoon or painting.

I increased the exposure and contrast to make everything pop more. I lowered the highlights to ensure the sky wasn’t too bright. Now, the magic happens when you edit the colors. I increased the vibrance to highlight the colors more, then moved the saturation up to deepen the colors and give the image a cartoon-esque look. It was also important to make the temperature warmer, to give everything that warm sunny day feeling that’s so inviting.

Like other images in this post, I smart-selected the sky and made the temperature cooler for a blue sky. I also removed the airplane trail in the top-right corner.

How about a selfie? And not just any selfie! This is a portrait mode selfie, with a blurred background and all. You’ll be glad to know you can accomplish this bokeh effect in post-processing. This is the only image in which I got close to my 5-minute limit, so just know that creating fake bokeh takes a bit.

Lightroom offers a person selection tool, so I went ahead and used it to outline our friend Scott, here. After this, I had to invert the selection, so everything except Scott was selected. As you can see, the Pixel-processed image has some outlining issues in the helmet’s top and the straps. Lightroom’s selection wasn’t perfect here either, but I added those parts manually using the brush tool. When I had Scott outlined, I went ahead and reduced the sharpness all the way down on the selected area. I also increased noise reduction as much as I could. This created the soft, bokeh effect everyone loves so much.

Of course, I also made general enhancements to exposure and color.
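
If you’d rather script this effect than brush it in, the underlying recipe is straightforward: blur the whole frame, then composite the sharp subject back in through a feathered mask. Here’s a rough OpenCV sketch of that recipe – the file names are hypothetical, and the subject mask is assumed to come from whatever selection tool you prefer:

```python
# Fake bokeh: blur the background, keep the subject sharp via a mask.
# "selfie.jpg" and "subject_mask.png" (white = subject) are placeholder
# file names; the mask could come from any person-selection tool.
import cv2
import numpy as np

image = cv2.imread("selfie.jpg")
mask = cv2.imread("subject_mask.png", cv2.IMREAD_GRAYSCALE)

# A heavy Gaussian blur stands in for optical bokeh.
blurred = cv2.GaussianBlur(image, (51, 51), 0)

# Feather the mask edge so the subject/background transition looks natural;
# hard edges around helmets and straps are exactly the artifact noted above.
alpha = cv2.GaussianBlur(mask, (21, 21), 0).astype(np.float32) / 255.0
alpha = alpha[..., None]  # broadcast across the three color channels

result = (image * alpha + blurred * (1.0 - alpha)).astype(np.uint8)
cv2.imwrite("portrait_bokeh.jpg", result)
```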

This shot is very similar to the other image of the park. I increased the exposure and contrast, reduced highlights, and enhanced colors. Additionally, I created a bit more of a shadowy area in the lower section of the image. It’s a slight one, but it helps make you feel like you’re there, enjoying the tree’s shade and looking at the landscape.

Not much to do here. I increased the exposure and contrast, while also increasing the texture and sharpness to make all that detail in the wood stand out. I also made the temperature warmer for a more realistic daylight look.

I just couldn’t leave the greens and purples so muted. This gorgeous flower had to stand out more. After fixing the exposure settings, I raised the vibrance and saturation just a bit. I also deepened the blacks to give everything a richer, darker look. It just makes plants look more luscious.

This pic reminded me a lot of that Windows XP wallpaper, albeit in yellow instead of green. I wanted the picture to emulate that look, but more subtly. The first step was to fix exposure and increase vibrance. I also selected the sky to make it bluer, but with a more aqua tone. Finally, I warmed the temperature and removed the photographer’s shadow.

Aside from exposure settings, I went manual on this lake shot to make select edits to the sky and the water, making both bluer. Additionally, I made the reflection of the mountain greener.

This cabin was a simpler edit. It was mostly fixing exposure, reducing the highlights, making the temperature warmer, and increasing the color vibrance to make the painting colors stand out.

Jack-o’-lanterns are naturally connected to Halloween. This image needed to be darker and gloomier, while also highlighting the intensity of the flame inside. It’s all about the contrast. I increased the exposure, but reduced the highlights and whites. I also deepened the blacks and added a smooth vignette around the frame, which I cropped to center the pumpkins. A warmer look goes better with Halloween and the fire, so I changed the temperature accordingly and increased the vibrance to bring the colors to life.

I did something very similar to this image. I wanted to keep its dark essence while enhancing it. So I lowered the highlights and increased the shadows a bit. I also got rid of that red hue in the fence, making it more naturally brown by cooling the temperature of the photo.

This particular photo was very complex, as the camera shot almost directly into the sun, creating very high contrast that usually kills both the highlights and the shadows. I’m happy with my results, though. First, I had to even out the exposure, which I did by reducing contrast, highlights, and whites, and increasing the shadows. I wanted the fence and trees to look natural, so I balanced the washed-out look by deepening the blacks.

The photo still looked a bit muted, thanks to all the contrast reduction. I used the dehaze tool to deepen colors further. I also brought out more detail in everything by increasing the texture. Once again, the sky was a bit too muted, so I selected it and made it bluer.

What do you think of the results? Of course, photography is highly subjective, and we all have different thoughts on what is aesthetically pleasing and what isn’t. My biased opinion is that machine is far from beating man in photography. This is because there is no such thing as a perfect algorithm or solution for creativity.

There is no such thing as a perfect algorithm or solution for creativity.

We all have a different idea of how a photo should look, and it changes depending on many factors, including your mood, the surrounding light, memories, psychology, and more. Learning to edit ensures that photos end up just the way you want them, not how an algorithm thinks you’ll like them.

Deal Alert: Save $100 on the Tokina atx-m 11-18mm f/2.8 for Sony E

Looking to expand the lens collection for your crop-sensor Sony mirrorless camera, or for a perfect holiday gift for the Sony shooter in your life? The Japanese lens manufacturer Tokina has an instant savings deal just for you. For a limited time, get $100 off the new Tokina atx-m 11-18mm f/2.8.

Announced in mid-September 2022, the Tokina atx-m 11-18mm f/2.8 E is Tokina’s first super-wide-angle zoom lens designed exclusively for mirrorless cameras.

Fully compatible with Sony E-mount APS-C bodies, the lens features a constant f/2.8 aperture and a 17-27mm equivalent focal range in 35mm full-frame terms, providing an angle of view of between 104 and 77 degrees.

Features and specs of the lens include a compact and lightweight body, 13 elements in 11 groups (including two aspherical elements and two super-low dispersion elements), a 9-bladed aperture diaphragm, a stepping motor, a close focusing distance of 0.62 feet (0.19m), a filter size of 67mm, a macro ratio of 1:9.2, and a micro USB port for future firmware updates.

“Engineered from the ground up, Tokina has brought the legendary, multiple award winning optical technology from the ATX 11-16mm f2.8 to mirrorless cameras,” Tokina says. “The new Tokina atx-m 11-18mm f2.8 sets the new standard for compact, fast aperture super wide-angle lenses specifically engineered to meet the high performance requirements of today’s crop-sensor mirrorless cameras.

“At the heart of the new optical design are two aspherical elements combined with two super low-dispersion lenses that suppresses chromatic aberrations, nearly eliminates coma at the edges at f2.8, and produces superior contrast and color reproduction. Making the lens an excellent choice for any type of photography including astrophotography.”

Tokina says the atx-m 11-18mm f/2.8 E is ideal for landscapes, group photos, environmental portraits, architecture, astrophotography, automobile photography, street photography, documentary videos, and vlogging.

Here are a few official sample photos captured with the lens (a larger selection of images can be found on Tokina’s website):

Tokina 11-18mm f/2.8

While the lens ordinarily carries a price tag of $599, Tokina is offering $100 in instant holiday savings, allowing you to purchase one for just $499 while the deal lasts.

Head on over to the Tokina USA online store if you’d like to order the lens. Shipping is free on orders of $75 or more.

How to produce space images using James Webb Space Telescope data

We live in rather amazing times, with private citizens travelling to space and citizen scientists contributing to the knowledge base of professional astronomy.

Now, just as it did with the Hubble Space Telescope, NASA has made data from the James Webb Space Telescope (JWST) available for download, for anyone to process for themselves.

Below is our final image of NGC 3132, processed from raw Webb data. Here we’ll walk you through how to do it, step-by-step.

See the James Webb Space Telescope’s latest images for inspiration and read our guide to image processing for more advice.

NGC 3132, captured by the James Webb Space Telescope and processed by Warren Keller.

How to get raw data from James Webb Space Telescope

Visit the MAST Portal, an archive named after Barbara Mikulski, a retired US senator and staunch supporter of space exploration.

Clicking on Advanced search at the top of the page opens a new window.

On the far right, type ‘JWST’ in the Mission box and press Enter.

At far left, under Columns, select Release date and scroll down to the box of the same name.

Type ‘2022-07-13 14:00:00’ as the beginning date and time – 13 July 2022 being the day on which the first observations were released to the public.

With the end date at default (the year 2050), note the number of Records found at the top of the page.

At the time of writing, there were already over 120,000 in the archive.

As NGC 3132 is our target and was one of the first data sets released, entering an end date of ‘2022-07-13 16:00:00’ displays a manageable 2,325 records.

Clicking Search at top left reveals the individual file folders and you’ll need to narrow the field yet again.

Under Instrument in the Filters box at left, choose the near-infrared data by checking NIRCAM.

Depending on the width of your monitor and browser window, you may need to use the scroll bar at the bottom to slide over to the Target name column.

Also note the Filters column. I found F187N, F356W and F444W to be the most useful filters.

Click on the floppy disc icons of records 13, 15 and 18 to download the zipped folders to your computer (see image below).

process webb telescope data step 01
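
If you’d rather script the search than click through the portal, the astroquery Python package can query the same MAST archive. The sketch below is a starting point rather than a recipe – the target name, instrument string and product filter are plausible values we’ve assumed, so fall back to the portal if it returns nothing:

```python
# Query MAST for JWST NIRCam imaging of NGC 3132 and download the
# level-3 "i2d" products. The criteria below are assumptions; adjust
# them (or browse the portal) if the query comes back empty.
from astroquery.mast import Observations

obs = Observations.query_criteria(
    obs_collection="JWST",
    target_name="NGC 3132",
    instrument_name="NIRCAM/IMAGE",
)

products = Observations.get_product_list(obs)
i2d = Observations.filter_products(products, productSubGroupDescription="I2D")
Observations.download_products(i2d)
```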

Choose your colours

Unzip the folder to a suitable location on your computer then open the parent folder, then a second folder with the same name and, finally, the JWST directory.

Next, open the Nircam folder, discarding all but the FITS file ending in ‘i2d.’

Double-clicking that file will open seven individual files in your program of choice, mine being PixInsight.

Of these, the seventh and last to open has a _SCI suffix and is the only file that you’ll need.

When finished, you will be left with three files to post-process, each ending in ‘i2d.fits’, with the filter names f444w_f470n, f356w and f187n.

Those of us who process narrowband images will understand the concept of ‘mapping’ data that’s invisible to the human eye to colours that we can perceive.

The same is true here. Rather than the emission lines of the Hubble palette, we’re now dealing with Webb’s near-infrared information.

How best to assign these filters? For guidance, search online for ‘NIRCam Filters – JWST User Documentation’ or visit the NIRCam Filters page.

There you will find a full-colour graph illustrating the transmission lines of each filter from short to long wavelengths.

While there’s no single, correct way to proceed, it made sense to me to assign the shortest wavelength data (F187N) to the blue channel, as blue is on the shorter end of the visible spectrum.

Conversely, I mapped the long wavelength F470N data to red and the medium F356W to green.

I found this to be the most aesthetically pleasing colour blend for this particular object, and strikingly similar to the Hubble SHO palette (see image below).

process webb telescope data step 02
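
If you’re not a PixInsight user, the same mapping can be roughed out in Python with astropy. The sketch below reads the SCI extension of each i2d file, applies a simple percentile clip and asinh stretch, then stacks the channels into an RGB image. The file names are placeholders, and the three channels must share a common pixel grid – the short- and long-wavelength NIRCam images differ in pixel scale, so you may need to resample first:

```python
# Map Webb near-infrared filters to RGB: F470N -> red, F356W -> green,
# F187N -> blue, with a simple asinh stretch. File names are placeholders
# for the three i2d downloads described above.
import numpy as np
from astropy.io import fits
from PIL import Image

def load_channel(path):
    # Read the calibrated image from the SCI extension of an i2d file.
    data = np.nan_to_num(fits.getdata(path, extname="SCI").astype(np.float32))
    lo, hi = np.percentile(data, (0.5, 99.9))
    data = np.clip((data - lo) / (hi - lo), 0.0, 1.0)
    # Gentle asinh stretch, similar in spirit to a boosted screen stretch.
    return np.arcsinh(10.0 * data) / np.arcsinh(10.0)

r = load_channel("f444w_f470n_i2d.fits")  # longest wavelength -> red
g = load_channel("f356w_i2d.fits")        # medium wavelength -> green
b = load_channel("f187n_i2d.fits")        # shortest wavelength -> blue

rgb = (np.dstack([r, g, b]) * 255).astype(np.uint8)
Image.fromarray(rgb).save("ngc3132_rgb.png")
```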

After marrying the channels with PixInsight’s Channel combination process, the images were cropped of edge artefacts and stretched with Histogram transformation (HT).

Transferring a Boosted autostretch from the STF (Screen transfer function) to HT with the RGB channels unlinked provided a great start to good colour.

PixInsight’s SCNR (Subtractive chromatic noise reduction) was then applied to reduce an undesirable green cast in the stars (see image below).

process webb telescope data step 03

From there, a range mask was applied, so that contrast, sharpness and colour saturation could be boosted in the nebula only.

As the data was so clean, no noise reduction was needed for our final image, which you can see at the very top of this page.

If you’re a Photoshop-based processor, be sure to watch Nico Carver’s excellent tutorial, ‘Can I process the JWST data better than NASA?’, on his ‘Nebula Photos’ YouTube channel.

Processing JWST data: 3 quick tips

  1. Knowing the release date of a particular data set will help narrow your records search considerably.
  2. Note that the strength of the NIRCam’s infrared signal may render noise reduction unnecessary.
  3. While gathering the data is a rather tedious process, the end result is well worth the effort!

Have you processed your own James Webb Space Telescope data? We’d love to see it! Get in touch by emailing [email protected].

This guide originally appeared in the November 2022 issue of BBC Sky at Night Magazine.

Twitter bans astrophotographer for three months over an “intimate” shot of a meteor

Can you imagine seeing anything “dirty” in a photo or video of a meteor? Yeah, neither can I. However, Twitter can, and it banned an astrophotographer this August because of that.

Astronomer and astrophotographer Mary McIntyre published a video of a meteor she took during the Perseid meteor shower. Twitter flagged it as “intimate content,” which resulted in the photographer being banned for three whole months!

The Perseid meteor shower is visible from mid-July to late August, and Mary took her photos on 11 August in Oxfordshire, UK, using a Canon 1100D and a kit lens. The meteor she shot left an ionization trail behind, making it quite a sight! “I honestly didn’t expect to see any of those with so much moonlight,” she wrote on Twitter. She posted a short video she composed from a fireball shot and seven subsequent images… And Twitter saw it as something that wasn’t allowed on the platform.

Here is the #IonizationTrail from the #Perseid #Fireball at 01:37 BST / 00:37 UT 13/08/22 from #Oxfordshire. Visually it was epic! Canon 1100D 18-55mm lens 8sec ISO-800 f/3.5. Video is made from the fireball + 7 subsequent images #Perseids2022 #PerseidsMeteorShower https://t.co/jSw3OTSw15

After Twitter flagged her video as “containing intimate content,” her only option was to delete the tweet. If she had accepted, she would have had to agree that she’d broken the rules. “It’s just crazy,” Mary told the BBC. “I don’t really want it on my record that I’ve been sharing pornographic material when I haven’t.”

Since she refused to delete the tweet of the sexy meteor, Twitter “rewarded” her with a three-month ban. She tried to appeal the decision but had no luck. Her account remained visible for three months, but she wasn’t allowed to access it.

“I miss the interaction,” Mary said, adding that she felt “a bit cut off from the astronomy world.” But since the ban was placed in August, she’s now back on the platform.

I’m back!!!!!!!!!! After 3 months of being blocked due to my Perseid meteor video being flagged as intimate media, I wasn’t able to get my account back unless I admitted to breaking the rule. Huge thanks to the BBC & to everybody who has been tagging support for me 🙂

Speaking with the BBC, tech commentator Kate Bevan said that this was an example of the limitations of AI tools that Twitter and other social media use for content moderation. “AI tools are OK for quick and dirty decisions, but it shows that content moderation at scale is really difficult – both for humans and for the AI tools that are supporting them,” she said. “It’s even worse when there are no humans available to review bad AI decisions. Apart from the unfairness, it means the AI model isn’t getting feedback, so it will never learn to improve.”

This reminded me of my favorite story ever, when an AI tool for detecting explicit content kept flagging photos of deserts as “nudity.” Comments on that article were brilliant (“Send dunes” still cracks me up), but honestly, I can see how AI can identify some dunes as nudes. But I can’t understand how on earth even artificial intelligence can see anything dirty in photos of a meteor. How?! Do you have any idea? Enlighten me in the comments.

[via the BBC]


