kris927b committed · Commit 077726d · 1 Parent(s): 0bd2b88

Adding explanation for the use of `wtf_wikipedia` as the parser.
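The diffs below add the same note to three dataset cards. Since the note concerns the `wtf_wikipedia` npm parser, here is a minimal sketch of the kind of wikicode-to-plain-text conversion it performs; the sample string is hypothetical, and this is not the repository's actual `parser/` code, only the library's public API:

```js
// Minimal sketch, assuming only the public wtf_wikipedia API;
// NOT the repository's parser/ code.
const wtf = require('wtf_wikipedia');

// Hypothetical Danish wikicode with bold markup and an internal link.
const wikicode = "'''Danmark''' er et land i [[Skandinavien]].";

// Parse the wikicode and emit plain text: markup such as '''...'''
// and [[...]] is stripped rather than leaking into the output.
const doc = wtf(wikicode);
console.log(doc.text()); // -> "Danmark er et land i Skandinavien."
```

Producing plain text without such leftover markup is what the added note credits `wtf_wikipedia` for, relative to the other parsers tested.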

data/wikibooks/wikibooks.md CHANGED
@@ -85,6 +85,7 @@ To run the `create.py` file you first need to do:
 $ cd parser/ && npm install && cd ..
 ```
 
+We chose to use `wtf_wikipedia` because it was empirically the best of the parsers we tested. We compared `mwparserfromhell`, `mediawiki_dump`, `wikiextractor`, and `wtf_wikipedia`; the others all still produced artifacts from the parsing of wikicode.
 
 ## Additional Information
 
data/wikipedia/wikipedia.md CHANGED
@@ -88,6 +88,7 @@ To run the `create.py` file you first need to do:
 $ cd parser/ && npm install && cd ..
 ```
 
+We chose to use `wtf_wikipedia` because it was empirically the best of the parsers we tested. We compared `mwparserfromhell`, `mediawiki_dump`, `wikiextractor`, and `wtf_wikipedia`; the others all still produced artifacts from the parsing of wikicode.
 
 ## Additional Information
 
data/wikisource/wikisource.md CHANGED
@@ -84,6 +84,8 @@ To run the `create.py` file you first need to do:
 $ cd parser/ && npm install && cd ..
 ```
 
+We chose to use `wtf_wikipedia` because it was empirically the best of the parsers we tested. We compared `mwparserfromhell`, `mediawiki_dump`, `wikiextractor`, and `wtf_wikipedia`; the others all still produced artifacts from the parsing of wikicode.
+
 ## Additional Information
 
 