even russian influence campaigns don’t do it this way. it’s probably a split between this kind of thing being too expensive (so they use underpaid interns instead) and accounts being too disposable (you can burn it all once the desired effect is achieved)
At the direction of, and with financial support from, the GRU, CGE and its personnel used generative AI tools to quickly create disinformation that would be distributed across a massive network of websites designed to imitate legitimate news outlets to create false corroboration between the stories, as well as to obfuscate their Russian origin. CGE built a server that hosts the generative AI tools and associated AI-created content, in order to avoid foreign web-hosting services that would block their activity. The GRU provided CGE and a network of U.S.-based facilitators with financial support to: build and maintain its AI-support server; maintain a network of at least 100 websites used in its disinformation operations; and contribute to the rent cost of the apartment where the server is housed. Korovin played a key role in coordinating financial support from the GRU to his employees and U.S.-based facilitators.
they literally did, actually
that’s news, not socials, but we are seeing LLMs deployed by social media bot networks