Do we, as a public, want companies like Facebook to be able to do large scale human subject research outside the regulatory and normative framework that academia has developed? What kind of norms and regulations do we want for new practices like A/B testing and the power it entails? How can we safeguard that large-scale, fine-grained human subject research – both by corporate entities and individuals – does not harm the individual and public good?
There is a lot of discussion right now about the Facebook newsfeed experiment. The human subject research aspect is certainly a big issue. Some argue that it's no different from the A/B testing Buzzfeed, Amazon, et al. do constantly, but that comparison is incorrect.
Facebook claims to be a utility to keep up with friends. There is an expectation that you should see the most important posts, whether they are good or bad, sad or happy. That's what it means to stay in touch with friends. Messing with what is shown means you are manipulating people's relationships, which should not be done lightly. How I feel about the posts I see should not matter. When my friends post something important, I should know.
The relationship between Amazon, Buzzfeed, etc. and their users is different. They are not communication tools or social networks. On Facebook, friends post content, Facebook organizes that information, and then I see it. On Amazon or Buzzfeed, there are only two parties involved, and it's really just about me as the user: they post things I can decide to read or not, and that's it.
So the newsfeed experiment, to me, is another big step in eroding trust and accountability between Facebook and its users. You can no longer be sure whether Facebook is really helping you stay in touch with friends, or whether it's just trying to squeeze another click out of you by showing you whatever emotional post of the moment.