According to Mary Meeker, we’re now uploading and sharing more than 500 million photos a day. By this time next year it will be a billion a day.
Which is awesome: pretty soon I won’t need to feel self-conscious about sharing so many pictures of my kids on Facebook. (And I’m pretty darn adorable just for being related to them, right?!) My narcissism, at least on a relative scale, will enjoy a refreshing decline as greater narcissists outpace me.
But here’s the rub. The vast majority of those photos are posted to ad-supported sites — Facebook, Instagram, Twitter and Tumblr — or apps that are free to consumers (such as Snapchat) and therefore will likely pursue some kind of ad model eventually. Advertisers almost always follow consumers to a new media property once the sheer number of consumers gets too large to ignore. Take the recent rise of WorldStarHipHop. According to American Public Media’s Marketplace:
In recent days, ads for Fiber One, Walmart and Bloomingdale’s have appeared on WorldStarHipHop. O’Denat [WorldStarHipHop's founder] says the site’s racier content used to be a problem for advertisers.
“Advertisers were afraid of us,” he said. “We were kind of risque. We had girls in thongs, fights. Not until later on they decided, well, the site is too big. We’re just gonna have to work with them.”
When Facebook landed in hot water this week, though, it wasn’t for thongs or fights. It acknowledged that its “systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate.” More specifically, as one advocacy group brought to light, Facebook ran advertising on pages with names like “Violently Raping Your Friend Just for Laughs.” Oy. I’m betting Facebook has already rolled out a solution to prevent this kind of thing from happening again — software has gotten pretty good at reading text on a page and understanding when that text is objectionable.
Managing brand safety gets really hard, though, when it comes to photo content at enormous scale. It’s still a whole lot easier to flag the word “rape” than it is to recognize a visual depiction of the crime inside one of those hundreds of millions of new photos that members publish each day. Advertisers are more than eager to pitch their wares to the enormous audiences at Facebook, Instagram, Twitter, Flickr, Tumblr and Snapchat, assuming they won’t tarnish their brands in the process. Figuring out that last part, especially as it pertains to user-submitted photos, is fast becoming a ginormous marketing opportunity. A billions-upon-billions of dollars opportunity.
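To see why the text side of that contrast is the easy half: a keyword blocklist against a page’s words is a few lines of code, while there is no equivalently short sketch for recognizing the same content in a photo. This is a toy illustration only; the function name and blocklist terms are hypothetical, not any platform’s actual system.

```python
import re

# Illustrative blocklist with a few inflections; a real system would be
# far larger and use smarter matching than exact word lookup.
BLOCKLIST = {"rape", "raping", "raped"}

def is_brand_safe(page_text: str) -> bool:
    """Return False if any blocklisted word appears in the page text."""
    words = set(re.findall(r"[a-z']+", page_text.lower()))
    return BLOCKLIST.isdisjoint(words)

print(is_brand_safe("Cute kitten photos and recipes"))                  # True
print(is_brand_safe("Violently Raping Your Friend Just for Laughs"))    # False
```

A check this crude would still have caught the page name in the story above; the point is that nothing remotely this simple exists for the pixels of an uploaded photo.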
(Disclosure: Until recently I worked at Luminate, one company that’s working on aspects of image recognition, including a visual brand watch system. I hope they succeed, partly because the online ecosystem will benefit from happy, confident advertisers, and partly — let’s be honest here — because those cute kids you see too often on Facebook want to go to college some day.)