Watch the video at the link.

A Wall Street Journal investigation found that TikTok needs only one important piece of information to figure out what you want: the amount of time you linger on a video.

Source: Investigation: How TikTok’s Algorithm Figures Out Your Deepest Desires
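The signal described in the investigation, watch time, can be illustrated with a minimal sketch. This is my own toy model, not TikTok's actual system; every function and field name here is invented for illustration:

```python
# Toy sketch of dwell-time-based interest scoring (illustrative only,
# not TikTok's real algorithm or data model).
from collections import defaultdict

def update_interests(interest_scores, video_topics, watch_seconds, video_length):
    """Boost topic scores in proportion to how long the user lingered."""
    completion = min(watch_seconds / video_length, 1.0)
    for topic in video_topics:
        interest_scores[topic] += completion
    return interest_scores

def rank_candidates(interest_scores, candidates):
    """Rank candidate videos by summed affinity for their topics."""
    return sorted(
        candidates,
        key=lambda v: sum(interest_scores.get(t, 0.0) for t in v["topics"]),
        reverse=True,
    )

scores = defaultdict(float)
# The user watched nearly all of a cooking video but skipped a politics one.
update_interests(scores, ["cooking"], watch_seconds=58, video_length=60)
update_interests(scores, ["politics"], watch_seconds=3, video_length=60)

candidates = [
    {"id": "a", "topics": ["politics"]},
    {"id": "b", "topics": ["cooking"]},
]
print(rank_candidates(scores, candidates)[0]["id"])  # prints "b"
```

The point of the sketch: no survey, no profile form, no stated preference is needed. Lingering alone is enough to reorder what you see next.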

The primary business model of the Internet has long been surveillance. Today it is automated psychological analysis that tries to learn your inner thoughts.

YouTube and Facebook behave similarly. This is why some think these “services” may tag you with attributes that even you did not know you had.

I had previously downloaded Facebook's database of attribute keywords about me, and if I remember correctly, it tagged me as both a young person and in my 50s, and as a member of both the Democratic and Republican parties simultaneously (I am not a member of any political party).

While we might think, whew, they do not really know me, that may be the wrong conclusion: they feed us more and more content based on their erroneous attributes. Ultimately, they use what they think they know about us in ways that may not be in our own best interests.

Related:

I recently began using a Twitter account I had set up long ago but barely used. Within a short span, Twitter began inserting tweets from accounts I did not subscribe to – in fact, they were from “experts” discussing Covid-19 topics. Twitter was deliberately trying to shape my views on Covid – in other words, the Twitter algorithm was operating in propaganda mode, feeding me messages intended to make me adopt someone's agenda.

Similarly, this winter, YouTube began inserting recommended videos on racism and critical race theory – not just an occasional recommendation but lots and lots, every day – even though sociology and psychology are not among my hot topics, although I do subscribe to channels from POC, of course.

YouTube announced it would intentionally work to shape the public's narrative around racism – in effect, YouTube is a propaganda operation, recommending content specifically intended to influence our behavior. Tellingly, the videos it was suggesting were not the best on the topic: most were 1-2 years old and had been viewed only 100 to 300 times. After a month, I began to flag the videos to tell the algorithm “I am not interested.” YouTube replied with a message saying it would recommend fewer such videos. Yet four weeks into flagging them, it continues to recommend these videos. In other words, YouTube has overridden its own recommendation algorithm to ignore the user's request – indicating that YouTube is, without question, now functioning as a propaganda operation.

By EdwardM
