A recent survey has revealed that most Americans believe the news media, more than any other institution, have a negative impact on their country. According to media analyst Lionel, these findings are hardly surprising.
Source: RT
