Apologies, this will be obvious to some, but I have seen enough so-so images on social media to convince me there are others who could post much better pictures if they took just a little bit more care. Smartphone cameras are so good these days that everyone is a photographer. But clearly, everyone isn’t a photographer.
So I thought I’d explain how I created the photo above, taken and edited on an iPhone 13 Pro (apart from one final modification, which we’ll get to at the end). I won’t go into massive detail, but I’ll assume people have at least looked at the editing functions available on the iPhone. They’re surprisingly powerful.
I was walking out of Senate House in London on a wet October night when I noticed a large pile of fallen leaves, a street made shiny by the falling rain, and the occasional pedestrian with an umbrella. I pulled out my camera and snapped the image below. I did this quite quickly, just waiting a few moments for someone with an umbrella to walk into the shot.
I held the iPhone close to the ground so that the pile of leaves would provide a foreground that screamed Autumn and frame the bottom of the shot. The railings on the right framed the picture on that side.
There’s quite a lot of empty, uninteresting space on the left side of the image, so I cropped closer. This also allowed me to position the pedestrian – the subject of the photo – close to the one-third lines that, for reasons that remain largely mysterious to me, help to achieve a more balanced shot. It gives the subject room within the image and an interesting position. Placing the subject at the edge of the frame tends to make for a less harmonious composition.
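If you like to be precise about it, the one-third lines are easy to work out for yourself. Here’s a toy sketch (purely illustrative – nothing the Photos app exposes) that returns the four rule-of-thirds intersection points for a given frame size; aiming the subject near one of these is the usual advice:

```python
def thirds_points(width, height):
    """Return the four rule-of-thirds intersections as (x, y) pixels."""
    xs = (width // 3, 2 * width // 3)
    ys = (height // 3, 2 * height // 3)
    return [(x, y) for x in xs for y in ys]

# For a 3000x2000 frame, place the subject near one of these points.
print(thirds_points(3000, 2000))
# → [(1000, 666), (1000, 1333), (2000, 666), (2000, 1333)]
```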
The initial photo was also dark, but the editing tools let you lighten dark areas in ways subtle enough not to look unrealistic. The sensor captures a lot of information that isn’t necessarily displayed by the camera’s automatic settings, smart as they are. On the ‘Adjust’ tab of the editing suite, I hit the ‘Auto’ button (the one with the magic wand). This helped to lighten the shadows and altered a few other settings, but you can go in and tweak any of the settings available here further. The main adjustments I made were to lighten the shadows even more (to about 82 out of 100) and to add a bit of vignetting, which darkens the edges of the image.
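Under the hood, these sliders are just curves applied to pixel values. Here’s a toy sketch of the two adjustments on a single 0–1 brightness value – my own crude simplification, not Apple’s actual algorithm:

```python
def lift_shadows(v, amount):
    """Brighten dark pixels more than bright ones; amount in 0..1."""
    # A gamma-style curve: amount=0 leaves v unchanged; higher lifts shadows.
    return v ** (1.0 - 0.5 * amount)

def vignette(v, dist, strength):
    """Darken a pixel by its normalised distance from centre (0=centre, 1=corner)."""
    return v * (1.0 - strength * dist ** 2)

dark_pixel = 0.10
print(round(lift_shadows(dark_pixel, 0.82), 3))  # a shadow pixel, noticeably brightened
print(round(vignette(0.8, 1.0, 0.3), 3))         # a corner pixel, darkened
```

The point of the power curve is that it moves a value of 0.1 much further than a value of 0.9 – which is why lifting shadows doesn’t blow out the highlights.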
On the Filters tab, I selected ‘Vivid Warm’ to enhance the orange-brown tones of the fallen leaves.
The image achieved at that stage looked pretty good and I posted it on Bluesky, where it was warmly received (40 likes).
However, the presence of the black car driving past the pedestrian bothered me – it was a distraction. If I’d thought about it at the time, I could have waited a few seconds more for it to disappear into the distance. But I was in a rush.
Instead, I turned to the erase tools available within Adobe Lightroom (for which I pay £10 a month to have on my laptop and iPhone). These are AI-powered, so all you need to do is roughly mark out the bit of the image you want to remove, and it will do a decent job of figuring out what would have been visible if the car hadn’t been there. As you can see from the final image, shown at the top of this post, it is remarkably good at this. This clever erase function is the best use of AI that I’ve come across!
And that’s all there is to it! Except of course, it isn’t. It takes practice to see in your mind’s eye the image that you might be able to make of the scene in front of you. You need to think about where you put the camera so as to create the most interesting composition. Sometimes that also means waiting – for someone to walk into or out of shot (look out especially for clutter in the background), or for the sun to come out. You can correct or even erase a multitude of errors with the editing software, but it is more satisfying to start by capturing something close to the picture you intend.