Facebook Shadow Social Network

Researchers warn, however, that the simulation risks bots interacting unexpectedly with real users, and therefore suggest isolating the bots from them.

Facebook is developing a shadow social network populated only by bots in order to understand how scammers and trolls operate on and exploit its platform.

The social media giant has built the shadow network using Web-Enabled Simulation (WES), an approach described in a research paper that explains how AI-driven sims can mimic human behavior.

Through the bots’ interactions, researchers hope to learn how scammers exploit the network to defraud other users or harvest their information.

The simulation allows the bots to like posts, send friend requests, and perform other actions available to a regular Facebook user.

Each bot is designed to simulate a different type of Facebook user: some are built to seek out targets, while others have traits that make them prone to scams.
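The idea of trait-driven bot personas can be pictured with a minimal sketch. The class, field names, and action strings below are illustrative assumptions for this article, not Facebook's actual WES implementation:

```python
import random
from dataclasses import dataclass

@dataclass
class Bot:
    """A hypothetical WES-style bot persona defined by a couple of traits."""
    name: str
    seeks_targets: bool   # scammer-like bots actively hunt for victims
    gullibility: float    # 0.0-1.0: how prone the bot is to taking the bait

    def choose_action(self, rng: random.Random) -> str:
        """Pick the bot's next simulated action based on its traits."""
        if self.seeks_targets:
            return "send_friend_request"    # hunt for potential victims
        if rng.random() < self.gullibility:
            return "accept_friend_request"  # a scam-prone bot takes the bait
        return "like_post"                  # ordinary background behavior

# Two personas: one that hunts, one that is easily scammed.
scammer = Bot("scammer_01", seeks_targets=True, gullibility=0.0)
victim = Bot("victim_01", seeks_targets=False, gullibility=0.9)

print(scammer.choose_action(random.Random(42)))  # prints "send_friend_request"
```

In a sketch like this, researchers could watch which sequences of actions let the scammer persona reach the gullible one, without touching any real account.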

The shadow social network simulates the real user interactions and social behavior seen on the live platform. As per the research paper, the WES system, unlike a traditional simulation, is built on the real-world software platform rather than on a model of reality.

WES promises to be realistic and actionable, and its on-platform simulation of complex community interactions could help researchers understand and automatically improve multi-user system deployments.

Facebook researchers building the simulation believe it will help them detect bugs in the world’s largest social network, currently used by about 2.5 billion people worldwide.

The simulation can run thousands of different scenarios simultaneously, which in turn helps automatically recommend changes and updates that can improve real users’ experience.
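Running many scenarios at once can be sketched as fanning simulated runs out over a worker pool and aggregating their reports. Everything here is a stand-in assumption: `run_scenario` and its placeholder metric are invented for illustration, since the real WES tooling is not public:

```python
from concurrent.futures import ThreadPoolExecutor

def run_scenario(scenario_id: int) -> dict:
    """Simulate one community-interaction scenario and report a (fake) metric."""
    friend_requests_sent = scenario_id % 5  # placeholder for a measured outcome
    return {"scenario": scenario_id, "friend_requests": friend_requests_sent}

def run_all(n_scenarios: int) -> list[dict]:
    """Run many scenarios concurrently and collect their reports."""
    with ThreadPoolExecutor(max_workers=8) as pool:
        return list(pool.map(run_scenario, range(n_scenarios)))

reports = run_all(100)
# Aggregating across runs is what would let a system flag scenarios
# where a change or update looks worthwhile.
busiest = max(reports, key=lambda r: r["friend_requests"])
```

The point of the sketch is the shape, not the numbers: thousands of cheap parallel runs, then an aggregation step that surfaces candidate improvements.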

Because only lines of code separate real Facebook users from the AI bots, researchers fear the experiment risks spilling over into the public version of Facebook.

Although the bots execute on real platform code, they must be reliably isolated from real users so that the simulation does not lead to unexpected interactions between the two.
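One way such isolation could work, sketched under assumed names (the registry, exception, and dispatcher below are hypothetical, not Facebook's design), is to check every simulated action against a registry of bot accounts and reject anything that would touch a real user:

```python
# Registry of accounts that belong to the simulation sandbox.
BOT_ACCOUNTS = {"bot_001", "bot_002"}

class IsolationError(Exception):
    """Raised when a simulated action would reach a real user."""

def dispatch_action(actor: str, target: str, action: str) -> str:
    """Execute a bot action only if both endpoints stay inside the sandbox."""
    if actor not in BOT_ACCOUNTS or target not in BOT_ACCOUNTS:
        raise IsolationError(f"{action} blocked: {actor} -> {target} leaves the sandbox")
    return f"{actor} performed {action} on {target}"

print(dispatch_action("bot_001", "bot_002", "like_post"))
```

A guard at the dispatch layer means the bots can share the real platform code while every path that could reach a genuine account is cut off in one place.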

Apart from isolation, some applications will require the bots to exhibit highly realistic user behavior, which poses challenges for the machine-learning approaches used to train them.
