So I googled my question but couldn't find anything. I called it American TV, but I think it applies to Britain as well, or at least that's where I assume the series come from! [b]Sorry if this offends anyone, I don't mean to.[/b]
But the question is: when I watch series such as Game of Thrones or True Blood, I see women naked, showing pretty much everything (of course not as explicit as in adult movies, but still fairly exposed). Yet you don't really see men naked beyond their chest and sometimes a butt.
Is there any reason for this? Sorry if this is a stupid question, but I'm really curious.
Edit: Thanks for the answers! It's a bit clearer now.