A review of laws surrounding the “significantly increasing” problem of online abuse is urgently needed, former culture secretary Maria Miller says.
Conservative MP Ms Miller, chair of the Commons women and equalities committee, said police found it “incredibly difficult” to make current laws work.
She added it was time to get tough on social media networks too, which treat online space as the “Wild West”.
Chief Constable Stephen Kavanagh, of Essex Police, the national digital policing lead, said responses to victims were inconsistent and more officer training was needed across the board.
Mr Kavanagh said there had been “an explosion” of different types of online crime – trolling, racist and homophobic abuse, sexting, revenge pornography – which were not even imaginable when he became a PC in 1985.
Yet, he said, the police were working with 30 different pieces of legislation, including the Computer Misuse Act, which is 26 years old, and another dating back to 1861.
“It’s not really helping investigators, the Crown Prosecution Service or victims to bring these people to justice,” he told BBC Radio 4’s Today programme.
Nicola Brookes’ story: Branded a prostitute, drug dealer and paedophile
My ordeal started in 2011. I was singled out for commenting on a Facebook page for an X Factor contestant. The abuse escalated very, very quickly, and included a fake paedophile profile made of me. They spread and shared my profile photo and name all over Facebook pages, saying I was a prostitute, a drug dealer, a paedophile. Obviously the other users were reacting to this.
The report system to Facebook did not work. My family, friends and I were constantly reporting the escalating abuse to Facebook. After about four days, I realised I needed expert help, so I contacted the police and a law firm. I was told to print out all the screenshots, which I did.
I took over 200 screenshots to my local police station. It was awful. I was in there less than 15 minutes. They would not look at the evidence. They said because it happened on Facebook, it was not a police matter and no crime had been committed. And they told me to close down my Facebook account.
Mr Kavanagh said victims turning up at a police front counter or ringing the station were getting inconsistent responses because of the variety of legislation, and because of how quickly things were moving. This, he said, was undermining victims’ confidence.
Ms Miller said a review of online abuse law was vital – similar to what had been done on revenge pornography a year ago.
“As a result of a campaign that I was leading, we’ve put in place a new law that partly recognises the posting of sexual images online in order to cause distress. If we can have a set of laws which pick up on online crime, that will enable the police to do more.”
She added that a clearer framework for operators was also needed because “at the moment, they are allowing criminal activity, whether on Facebook or Twitter or any other platforms, to go unchecked”.
She called for them to enable people to report offences “more readily” and ensure action was taken.
“There have to be consequences and, at the moment, there aren’t,” she said.
A spokesman for Facebook said the company worked with safety experts and took feedback from users to combat the “small minority of people… intent on harassing others online”.
New tools were being developed including an impersonation alert which “flags if another account is pretending to be you”, he said.
Facebook has said no government, charity, parent or company could make the internet a “safe” place, but they could work together to “educate and empower” people.
Kira O’Connor, from Twitter, said content in violation of its rules would be removed and offending accounts could be permanently suspended.
But she said tech companies “cannot simply delete prejudicial views from society”.
“Intolerance, in all its forms, is a deeply rooted societal problem. That’s why we also focus on education and building strong partnerships with the organisations that are tackling these issues head on,” she added.
Facebook and Google are currently hosting the first joint EU child safety summit in Dublin.
Charlotte Holloway, head of policy at techUK, which represents 900 UK technology companies, said the industry took its responsibilities to keep users safe and secure online “very seriously”.
Existing legislation was “fit for purpose”, she said, and the focus should be on ensuring the police and prosecutors have the right skills and resources to bring perpetrators to justice.
The call for a review of the law came to light in a Guardian newspaper series, The Web We Want, which explored the darker side of online comments and efforts to bring about better conversations online.