The Chairman noted that the companies formally created an organization to fight online terrorism in June 2017 and touted its success in purging ISIS and al-Qaeda content, but struggled to keep up with the dissemination of the Christchurch video. People have "largely been kept in the dark" about the sites' success in fighting other kinds of online extremism, according to Thompson. He considered it a problem that Facebook wasn't aware of the video until informed by New Zealand police, and that YouTube hadn't addressed the "systemic flaws" that let the material spread.
If the companies don't focus on responding to these videos, Thompson warned, Congress ought to "consider policies" that would prevent the distribution of terrorist content, including measures that might replicate those already adopted in other countries.
Facebook has confirmed to Engadget that it will brief the committee "soon." We've asked the other companies for comment as well. Whatever happens, it's safe to say the companies will argue that it's a challenge to keep the shooting video off their sites. They'd be right to a degree (it's not hard to edit a video to bypass filtering), but that won't necessarily satisfy committee members worried that the video is still comparatively easy to find.