Facebook hopes to write the security 'playbook' for others to follow



It was only back in March that Facebook's Chief Security Officer, Alex Stamos, was rumored to be leaving the company after reportedly clashing with other execs over its disclosure of Russia's meddling in the 2016 US presidential election. But that departure never happened (though he did say his role had "changed"), and Facebook has since been dealing with far more than election interference -- like ensuring that it keeps people's personal data safe. Today, Stamos took the stage at the F8 developers conference to talk about Facebook's efforts in security and how it plans to address the many issues it faces now and others that may arise in the future.


Stamos said on stage, during a keynote dubbed "Security at Facebook Scale," that the company has a responsibility to the world to build products that bring people closer together, but at the same time keep them safe from abuse. He added that one of the biggest challenges Facebook has had to face recently is knowing that, even though its tools are designed to create positive connections between people, building them comes with a risk. And that's something the company has had to learn the hard way in the past couple of years, after what happened with the 2016 election in the US and, most recently, the Cambridge Analytica data misuse nightmare.

"Protecting people's data is extremely important," Stamos said. "But doing that is not enough. I personally had to adjust to understand that security is more than building systems, it's understanding how tech can be abused to cause harm." He said that while Facebook "can build technically perfect products, there are still bad things that could happen," and it needs to take that into consideration with anything it makes going forward. "When you think about AI you have to think about risks in how you're training it," Stamos said. "With VR, it's not just the great fun things, but what is safety in a VR world? How are we gonna build that at scale when there are not any examples to go off?"

Stamos said that Facebook's goal isn't just to figure out the answers to these questions on its own, by investing heavily in new technologies and hiring more people to filter out bad content, but also to work together with academics and even other tech companies. "When we do this work, we also have to build relationships around the world. We have a lot of work to do to understand our responsibility," he said. "There's no magic solution to these problems, but we're also not going to allow these challenges to paralyze us. We're going to build the playbook for companies to follow us, build at scale and mitigate those risks."

Source: Engadget
