The recent social media boycott by the Football Association and English football teams has helped to highlight the sustained discriminatory and offensive abuse that many players and other high profile figures receive. However, the problem extends far beyond this: many people - including vulnerable individuals and children - are often targeted, as well as companies, their employees and other stakeholders.
Civil legal options available to victims largely apply once offensive content has been posted. Depending on the content, it may be possible to bring claims in:
However, these claims are not always straightforward, particularly if, for example, the perpetrator lives outside the jurisdiction. In addition, much offensive content is posted online anonymously and, whilst it can be possible to obtain court orders to help uncover the identities of anonymous trolls, this inevitably increases the expense.
Whilst campaigns to change and improve the law continue, social media organisations have often been criticised for failing to do enough themselves to help tackle online bullying.
However, after around a year of testing with certain users, Twitter has recently started rolling out a new feature that may help to reduce the amount of online abuse. If its algorithms consider that a prospective reply to a tweet could be harmful or offensive, it encourages the user to think twice before posting: it first asks if they want to review the reply before publishing it, and gives them the option to edit, delete or send it anyway.
Twitter has said that during its testing it learned that, if prompted:
Clearly this will not discourage all users intent on posting offensive content, but it is hoped that, by giving users the opportunity to pause and reflect before replying, some will reconsider before posting something harmful.