We’ve all seen beauty and fashion ads that are so ridiculously Photoshopped it seems as though the company used aliens as models. Then there are the ads that are much craftier, skillfully using Photoshop to render photos of already beautiful people into works of art. And that’s when the dangers of Photoshop start to take effect, creating unrealistic standards of beauty.
Now, researchers have developed a new tool that can actually estimate the level of retouching that went into creating an image. Wired reports that a pair of researchers from Dartmouth College have presented their new technology in a recent study in the Proceedings of the National Academy of Sciences.
According to Wired:
“[Forensics specialist Hany] Farid and doctoral student Eric Kee debut a computational model developed by analyzing 468 sets of original and retouched photographs. From these, Farid and Kee distilled a formal mathematical description of alterations made to models’ shapes and features. Their model then scored each altered photograph on a scale of 1 to 5, with 5 signifying heavy retouching.”
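The general idea — boil the differences between an original and a retouched photo down to a couple of distortion measures, then map them onto a 1-to-5 scale — can be illustrated with a toy sketch. To be clear, this is not Farid and Kee's actual model (theirs was fit to 468 before-and-after pairs and calibrated against human ratings); every function, weight, and threshold below is invented purely for illustration:

```python
import math

def geometric_distortion(before_pts, after_pts):
    # Mean Euclidean displacement of matched landmark points (a hypothetical
    # stand-in for a "how much were shapes warped" measure).
    dists = [math.dist(b, a) for b, a in zip(before_pts, after_pts)]
    return sum(dists) / len(dists)

def photometric_distortion(before_px, after_px):
    # Mean absolute change in pixel intensity (a hypothetical stand-in for a
    # "how much was skin smoothed / tone altered" measure).
    diffs = [abs(b - a) for b, a in zip(before_px, after_px)]
    return sum(diffs) / len(diffs)

def retouch_score(geo, photo, w_geo=0.5, w_photo=0.5,
                  geo_max=20.0, photo_max=50.0):
    # Normalize each distortion against an assumed "heavy retouching" ceiling,
    # combine with arbitrary weights, and map to the 1-to-5 scale, where 1
    # means essentially untouched and 5 means heavily retouched.
    g = min(geo / geo_max, 1.0)
    p = min(photo / photo_max, 1.0)
    return 1.0 + 4.0 * (w_geo * g + w_photo * p)
```

An unaltered image (zero displacement, zero intensity change) scores 1.0, and distortions at or beyond both ceilings score 5.0; the real model's contribution was learning which measures and weights actually track human judgments.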
Farid and Kee reportedly began researching their model after the U.K. took a stand against Photoshop in advertisements, with plans to mandate that altered images must be labeled as such. Helming that initiative is Liberal Democrat MP Jo Swinson, who’s been vocal about her objections to the unrealistic images. “Pictures of flawless skin and super-slim bodies are all around, but they don’t reflect reality,” Swinson told the Guardian this past summer. “Excessive airbrushing and digital manipulation techniques have become the norm.”
That got the wheels turning in Farid’s mind: “It’s an interesting scientific problem: How much is too much? That got us thinking about whether we could quantify this.”
While Farid and Kee haven’t answered the conceptual question of how much is too much, they have at least provided a tool for estimating how much Photoshop went into a particular ad. You can test it out for yourself at Dartmouth’s website.