For three months before the 2004 U.S. presidential election, researchers studied how the brains of committed Democratic and Republican voters processed contradictory information. While brain activity was measured with functional magnetic resonance imaging, participants were shown fictional but realistic statements from their own and opposing candidates. For example, a politician promises to cut taxes; then, once in power, the politician makes statements and takes actions that contradict the promise and suggest dishonesty, such as claiming that “there wasn’t enough money in the state budget to cut taxes.” Participants were then asked to rate the consistency between the first statement and the contradictory action. The behavioral analysis revealed an emotionally biased pattern of reasoning: while participants denied or explained away obvious contradictions in their own candidate, they readily detected inconsistencies in the opposing candidate.
In digital media, users tend to accept information that is consistent with or supports their own thoughts and beliefs without questioning it. They question news and information only when it conflicts with those thoughts and beliefs. This tendency is called “confirmation bias.”
The algorithms that underlie social networks infer users’ interests and serve personalized rather than random content. By intervening in the flow of information shared on social networks, algorithms confine each person to an environment built solely around their interests. This situation is explained by the concept of the “filter bubble.” Filter bubbles cause users to be constantly exposed to similar information, to feed on the same ideas, and to remain closed to different news sources, leaving them open to disinformation.
Users are trapped in a bubble of content that feeds their own thoughts and prejudices and reflects only their own interests. Inside these bubbles, they are surrounded by views similar to their own, left echoing their own voice and voices like it.
This effect causes users in digital environments to reinforce their existing views and beliefs, remain closed to different perspectives, fail to develop a critical eye, drift away from objectivity, and lose sight of the truth over time.
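The feedback loop described above can be made concrete with a toy sketch. The code below is not any real platform’s algorithm; it is a minimal, invented model assuming only two mechanics: a feed ranks items by how well they match the user’s current interest profile, and the profile is then reinforced by whatever was shown. Even a tiny initial lean is enough to lock the feed onto the same topics and crowd out the rest.

```python
# Toy model of a "filter bubble". Topics, scores, and the update rule
# are illustrative assumptions, not a real recommender system.
TOPICS = ["politics", "sports", "science", "arts", "tech"]

def recommend(profile, items, k=3):
    """Return the k items whose topics the user already prefers most."""
    return sorted(items, key=lambda item: -profile[item])[:k]

def update(profile, shown):
    """Reinforce the profile with the topics the user was just shown."""
    for topic in shown:
        profile[topic] += 1.0
    return profile

# Start nearly neutral, with a tiny initial lean toward one topic.
profile = {t: 1.0 for t in TOPICS}
profile["politics"] = 1.1

for _ in range(5):
    shown = recommend(profile, TOPICS)
    profile = update(profile, shown)

# After a few rounds, the feed shows only the topics it started with;
# "arts" and "tech" are never surfaced and never reinforced.
print(recommend(profile, TOPICS))
```

The point of the sketch is the self-reinforcing loop: ranking by similarity plus reinforcement from exposure means the topics shown first keep winning, which is the mechanical core of the bubble the text describes.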
Disinformation and lies find the opportunity to multiply and deepen precisely in this environment.