Toxic-comments

The inspiration for this toxicity classification challenge comes from the idea of using ML to enable better online conversations. Unfortunately, many online discussions devolve into acrimonious arguments or outright harassment. If conversations are so bad that people leave the discussion, then we have clearly failed to have an online discussion, let alone a good one! This was the basis for working with Wikimedia to create a dataset of comments from Wikipedia Talk pages that have been crowd-evaluated for toxicity (rude, disrespectful, or otherwise likely to make people leave the discussion), as well as for the type of toxicity present in each comment.
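A task like this is usually framed as multi-label text classification: each comment gets an independent probability for each toxicity type. Below is a minimal baseline sketch using TF-IDF features with one logistic regression per label. The label names and the tiny inline sample are assumptions modeled on the public Kaggle toxic-comments dataset, not code from this repository.

```python
# Minimal multi-label toxicity baseline: TF-IDF + one logistic
# regression per label. Labels and example comments are illustrative
# assumptions, not the repository's actual training data.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Assumed label set (mirrors the Kaggle challenge's six toxicity types).
LABELS = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]

# Tiny stand-in for the real Wikipedia Talk-page comments.
comments = [
    "thank you for the helpful edit",
    "you are an idiot and should leave",
    "great article, well sourced",
    "i will find you, watch your back",
]
# One 0/1 row per comment, one column per label.
y = np.array([
    [0, 0, 0, 0, 0, 0],
    [1, 0, 0, 0, 1, 0],
    [0, 0, 0, 0, 0, 0],
    [1, 0, 0, 1, 0, 0],
])

# Word and bigram TF-IDF features.
vectorizer = TfidfVectorizer(ngram_range=(1, 2), min_df=1)
X = vectorizer.fit_transform(comments)

# OneVsRestClassifier fits an independent binary classifier per label.
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))
clf.fit(X, y)

# One toxicity probability per label for each new comment.
probs = clf.predict_proba(vectorizer.transform(["what a wonderful discussion"]))
print(probs.shape)
```

On the real dataset, per-label probabilities like these would be evaluated with a metric such as mean column-wise ROC AUC, as in the Kaggle challenge.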

About

Kaggle competition
