Facebook Solution To Spot Harmful Behavior of People

Facebook has created a machine learning solution that trains bots to realistically simulate the behaviour of real people on a social media platform, a move that will improve software testing for complex environments, particularly in product areas related to safety, security and privacy.

For large-scale social networks, testing a proposed code update or new feature is a complex and challenging task.


According to Mark Harman, Research Scientist at Facebook AI, people's behaviour evolves and adapts over time and varies from one geography to the next, making it difficult to anticipate all the ways an individual or an entire community might respond to even a small change in their environment.

Facebook researchers have now developed Web-Enabled Simulation (WES) to overcome this problem.

"WES is a new method for building the first highly realistic, large-scale simulations of complex social networks," Harman wrote in a blog post.

Bots are trained to interact with each other using the same infrastructure as real users: they can send messages to other bots, comment on other bots' posts, publish posts of their own, and send friend requests to other bots.
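
To make the idea concrete, here is a minimal sketch of how such bots might exercise those same user-facing actions through a shared platform interface. The names used (SocialPlatform, SimulatedBot and their methods) are illustrative assumptions, not Facebook's actual WES API.

```python
# Hypothetical sketch: bots driving the same social actions a real user would
# (messages, comments, posts, friend requests) against an in-memory stand-in
# for the platform. Not Facebook's WES code; names are assumptions.
import random


class SocialPlatform:
    """Stand-in for the user-facing API that both real users and bots would call."""

    def __init__(self):
        self.log = []  # record of every interaction performed by the bots

    def send_message(self, sender, recipient, text):
        self.log.append(("message", sender, recipient, text))

    def comment(self, commenter, author, text):
        self.log.append(("comment", commenter, author, text))

    def publish_post(self, author, text):
        self.log.append(("post", author, text))

    def send_friend_request(self, sender, recipient):
        self.log.append(("friend_request", sender, recipient))


class SimulatedBot:
    def __init__(self, bot_id, platform):
        self.bot_id = bot_id
        self.platform = platform

    def act(self, other_bots):
        # Pick a random peer and a random social action, mirroring the
        # bot-to-bot interactions described in the article.
        target = random.choice(other_bots)
        action = random.choice(["message", "comment", "post", "friend_request"])
        if action == "message":
            self.platform.send_message(self.bot_id, target.bot_id, "hi")
        elif action == "comment":
            self.platform.comment(self.bot_id, target.bot_id, "nice post")
        elif action == "post":
            self.platform.publish_post(self.bot_id, "a new post")
        else:
            self.platform.send_friend_request(self.bot_id, target.bot_id)


platform = SocialPlatform()
bots = [SimulatedBot(i, platform) for i in range(5)]
for bot in bots:
    bot.act([b for b in bots if b is not bot])
print(platform.log)
```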

Bots cannot engage with real users and their behaviour cannot have any impact on real users or their experiences on the platform.

WES is able to automate interactions between thousands or even millions of bots.

"We are using a combination of online and offline simulation, training bots with anything from simple rules and supervised machine learning to more sophisticated reinforcement learning," said Harman.

WES deploys these bots on the platform's actual production codebase.

The bots can interact with one another but are isolated from real users.

This real-infrastructure simulation ensures that the bots' actions are faithful to the effects that would be witnessed by real people using the platform.
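
That isolation guarantee can be pictured as a permission check that only allows interactions where both endpoints are simulated bots. The account model below is an assumption made for illustration, not the platform's actual enforcement mechanism.

```python
# Hypothetical sketch of the isolation rule: an action is permitted only when
# both participants are simulated bots, so bot behaviour never reaches real users.
BOT_ACCOUNTS = {"bot_1", "bot_2"}
REAL_ACCOUNTS = {"alice", "bob"}


def is_allowed(sender, recipient):
    """Permit an interaction only if both participants are simulated bots."""
    return sender in BOT_ACCOUNTS and recipient in BOT_ACCOUNTS


assert is_allowed("bot_1", "bot_2")
assert not is_allowed("bot_1", "alice")  # bots cannot reach real users
assert not is_allowed("alice", "bot_1")  # and real users cannot reach bots
```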

"With WES, we are also developing the ability to answer counterfactual and what-if questions with scalability, realism, and experimental control," said Facebook.

The company has used WES to build WW, a simulated Facebook environment using the platform's actual production codebase. (IANS)
