What the fuck is wrong with American culture? We can watch men, women, and children butchered in various vile manners, and sit through seemingly endless parades of programs about serial killers, vampires, and other creatures. But the slightest shot of a nude breast or bum is offensive? To whom? Is our society so repressed that we act more like the Inquisition than the freedom-loving people we are supposed to be? The nude body is taboo, but a dismembered one isn't? WHAT THE FUCK!!!!!! Ok..... rant over.