Can you imagine seeing anything “dirty” in a photo or video of a meteor? Yeah, neither can I. However, Twitter can, and it banned an astrophotographer this August because of that.
Astronomer and astrophotographer Mary McIntyre published a video of a meteor she captured during the Perseid meteor shower. Twitter flagged it as “intimate content,” and the photographer ended up banned for three whole months!
The Perseid meteor shower is visible from mid-July to late August, and Mary took her photos on 11 August in Oxfordshire, UK, using a Canon 1100D and a kit lens. The meteor she shot left an ionization trail behind, making it quite a sight! “I honestly didn’t expect to see any of those with so much moonlight,” she wrote on Twitter. She posted a short video she composed from a fireball shot and seven subsequent images… And Twitter saw it as something that wasn’t allowed on the platform.
After Twitter flagged her video as “containing intimate content,” her only option was to delete the tweet. But to do so, she would have had to acknowledge that she had broken the rules. “It’s just crazy,” Mary told the BBC. “I don’t really want it on my record that I’ve been sharing pornographic material when I haven’t.”
Since she refused to delete the tweet of the sexy meteor, Twitter “rewarded” her with a three-month ban. She tried to appeal the decision but had no luck. Her account remained visible for three months, but she wasn’t allowed to access it.
“I miss the interaction,” Mary said, adding that she felt “a bit cut off from the astronomy world.” But since the ban was placed in August, her three months are up and she’s now back on the platform.
Speaking with the BBC, tech commentator Kate Bevan said that this was an example of the limitations of AI tools that Twitter and other social media use for content moderation. “AI tools are OK for quick and dirty decisions, but it shows that content moderation at scale is really difficult – both for humans and for the AI tools that are supporting them,” she said. “It’s even worse when there are no humans available to review bad AI decisions. Apart from the unfairness, it means the AI model isn’t getting feedback, so it will never learn to improve.”
This reminded me of my favorite story ever, when an AI tool for detecting explicit content kept flagging photos of deserts as “nudity.” The comments on that article were brilliant (“Send dunes” still cracks me up), and honestly, I can see how an AI might mistake some dunes for nudes. But I can’t understand how on earth even artificial intelligence can see anything dirty in photos of a meteor. How?! Do you have any idea? Enlighten me in the comments.
[via the BBC]