The Wall Street Journal is reporting that Instagram, the photo- and video-sharing app, has unveiled a more detailed standard for images, aimed at curbing pornography and harassment.

Instagram’s monthly user count has exploded from 30 million to 300 million since Facebook bought the app. With that growth have come questions about how the service should police bullying and potentially offensive content. Instagram doesn’t screen images before they are posted; instead, it reviews images that prompt complaints from users and removes those that violate its guidelines.

Read the full article here >>