Photos taken by cell phone cameras increasingly can't be trusted as evidence of the state of something. Let's say you take a picture of a car that just hit a pedestrian and is driving away.
Pre-AI, your picture might be a bit blurry, but say it's discernible that one of the headlights had a chunk taken out of it; it's only a few pixels, but there's obviously some damage, like a hole from a rock or a pellet gun. Police find a suspect, see the car, note damage to the headlight that looks like a very close match, get a warrant for the suspect's records, find incriminating texts or whatnot, and boom, the person goes to jail for killing someone (assuming this isn't the US, where people almost never go to jail for assault, manslaughter, or homicide committed with a car), because the judge or jury are shown photos from the scene, then photos taken by detectives from the street of the person's driveway, and then nice close-ups from the evidence techs.
Post-"AI" bullshit, the AI sees what looks like a car headlight, assumes the few-pixels damage is dust on the sensor/lens or noise, and "fixes" the image, removing it and turning it into a perfect-looking headlight.
Or, how about the inverse? A defense attorney can now argue that a cell phone camera photo can't be relied upon as evidence because of all the manipulation that goes on. That backpack in a photo someone takes as a mugger runs away? Maybe the phone's algorithm decided a glint of light was a logo and extrapolated it into a popular athletic brand's logo.
I’d just like them to fix the problem where license plates are completely unreadable by most consumer cameras at night. It’s almost as though they are intentionally bad. (The plate ends up as a blown out white rectangle.)