If you are considering creating and using bots to work for you as you fight online bigotry and harassment, there are some things you need to watch out for. Twitter is not against bots as such: if you only want to create a bot that collects information from Twitter for you to analyse, or one that simply tweets out to no one in particular, you will probably not run into problems. However, if you want your bot to tweet at other Twitter users, you have to take Twitter's current policy against spam into account.

Also keep in mind that language is complicated and 'slippery', so if you want to tackle violence against women and trans* persons online (for example), you will have to be very careful about what kind of language you search for. For every person using the word 'bitch' on Twitter to intimidate or harass someone, there are probably at least five others using it affectionately to tell a friend how much they love them. The best way to figure out how language is being used to cause harm is to crowdsource terms from people who have been harassed, then experiment with pulling matching tweets from Twitter using data-gathering bots and analyse the results yourself, as in the sketch below.

Continue reading this section to learn how to set up Twitter accounts to act as bots for you and your activism.
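For example, a data-gathering bot can be as simple as a short script that searches for one of your crowdsourced terms and saves the matching tweets to a file for you to read through by hand. The following is only a minimal sketch, not part of this manual's official tooling: it assumes the Python Tweepy library (https://www.tweepy.org/), a Twitter developer bearer token, and placeholder values for the token, the search query, and the output file name.

# A minimal data-gathering sketch, assuming the Tweepy library and a
# Twitter developer bearer token. The search term and file name are
# placeholders -- swap in the language you have crowdsourced from
# people who have experienced harassment.

import csv
import tweepy

BEARER_TOKEN = "YOUR-BEARER-TOKEN"           # placeholder credential
SEARCH_TERM = '"bitch" lang:en -is:retweet'  # example query; refine with crowdsourced terms

client = tweepy.Client(bearer_token=BEARER_TOKEN)

# Pull a batch of recent tweets that match the search term.
response = client.search_recent_tweets(query=SEARCH_TERM, max_results=100)

# Save the results to a CSV file so you can read and categorise them
# yourself: the point is for you to judge what counts as harassment,
# not to let the bot decide.
with open("gathered_tweets.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["tweet_id", "text"])
    for tweet in (response.data or []):
        writer.writerow([tweet.id, tweet.text])

Running a script like this on several crowdsourced terms, and comparing how often each one appears in harassing versus friendly contexts, gives you the raw material to decide which terms are actually worth acting on.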