Taylor Swift has entered the chat. And so have singer Charli XCX, brat memes, cats and dogs, offbeat humor, absurd AI-generated photos, and firestorms of misinformation.
It's been a bizarre presidential election season. University of Michigan School of Information researchers, experts on social media, misinformation and disinformation, have watched it all unfold.
"Social media has played a bigger role, I think, than we saw in the last election," said professor Cliff Lampe, a social media expert.
More celebrities have made their political voices heard. For instance, Taylor Swift signed her endorsement of the Harris-Walz campaign with "Childless Cat Lady," a dig at JD Vance. Donald Trump Jr. posted an AI-generated photo of Donald Trump on a giant cat with the caption "Save our pets!!!!!"
The Harris-Walz campaign has targeted Gen Z Americans on their home turf: TikTok.
Could this be a good thing?
"Younger people have been really demotivated to come out and vote," says assistant professor Chelsea Peterson-Salahuddin, whose research focuses on the culturally specific ways marginalized communities-most often Black women, femmes and queer folks-engage with mass and digital communications technologies to seek information and build community. "The Harris campaign has tapped into TikTok meme culture and attempted to capitalize on a younger vote."
The 2024 election, like the two presidential elections before it, has demonstrated how technology can both undermine and empower democracy. But associate professor David Jurgens, an expert in artificial intelligence and social media, notes that one bedrock of democratic discourse, rhetoric, seems stuck in perpetual decline.
"The degree of eloquence in making campaign arguments has dropped," he said. "I think the 2016 and 2020 elections were effective in raising the anger rhetoric, but in 2024, the silliness rhetoric has come onboard, as well."
Although the memes are sometimes funny, their ability to masquerade as authoritative sources of truth has become a pressing concern.
UMSI experts have studied the impact of misinformation (false or misleading information shared without harmful intent) on democracy.
"Social media allows for the rapid spread of disinformation," Lampe said. "If we can't even agree on fairly straightforward things, and truth becomes entirely subjective to your identity affiliation, it can be super harmful to democracy overall. The 2020 claims of the stolen election, for example, were proven wrong multiple times. But it doesn't matter, right? There's definitely been a blow to democracy."
UMSI experts said it will take media literacy, education and government regulation to combat the pervasive spread of falsehoods and restore trust in public discourse. The widespread use of generative AI, in particular, has changed the sophistication and quality of the information people can produce. It has also raised the level of skepticism with which people engage with information they see online.
Assistant professor Nazanin Andalibi said that to better protect against harmful speech, the United States needs greater regulation and social media companies' incentive structures need to change.
"We can't trust tech companies to be of any help here," she said, regarding companies self-regulating. "Their obligation is to make money."
Andalibi would like to see a social media governance model that attends to a different set of values, with incentive structures and motivations unlike those that currently exist.
There is no quick fix for these issues, but election years have a way of highlighting what's at stake when technology is unrestrained and moving faster than society can control it.
"We need to be thinking critically about what types of information infrastructures we need to build back the trust that has been lost," said assistant professor Matt Bui, who studies data justice and activism. "We need infrastructures that help communities form but also foster a sense of cohesion and unity."