Hollywood is the home of American cinema and a place where many dream of living and thriving. But more often than not, we see a dark underbelly beneath all the glamour and success. Which of these darker depictions and deconstructions of the Hollywood film industry and lifestyle is your favorite? Discuss here.