Wikipedia parser that generates offline content embeddable into Organic Maps mwm map files.
Initial benchmark results from running the 4 tests:

    test hash_wikidata   ... bench:     14 ns/iter (+/- 0)
    test hash_wikipedia  ... bench:     34 ns/iter (+/- 1)
    test parse_wikidata  ... bench:     18 ns/iter (+/- 0)
    test parse_wikipedia ... bench: 60,682 ns/iter (+/- 83,376)

Based on these results and a flamegraph of loading the file, URL parsing is the most expensive part.
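The `parse_wikipedia` numbers suggest that most of the per-article cost lies in turning a Wikipedia URL into a (language, title) pair. Below is a minimal sketch of how that kind of parsing could be done with plain string slicing instead of a general-purpose URL parser; the function name, the URL shapes it accepts, and the slicing approach are assumptions for illustration, not the crate's actual code.

```rust
/// Hypothetical helper (not the crate's actual API): split a Wikipedia
/// article URL such as "https://en.wikipedia.org/wiki/Berlin" into a
/// (language, title) pair using plain string slicing, avoiding the cost
/// of a general-purpose URL parser.
fn parse_wikipedia_url(url: &str) -> Option<(&str, &str)> {
    // Drop the scheme if present.
    let rest = url
        .strip_prefix("https://")
        .or_else(|| url.strip_prefix("http://"))
        .unwrap_or(url);

    // Expect a host of the form "<lang>.wikipedia.org" followed by the path.
    let (host, path) = rest.split_once('/')?;
    let lang = host.strip_suffix(".wikipedia.org")?;

    // Expect a path of the form "wiki/<title>".
    let title = path.strip_prefix("wiki/")?;
    if lang.is_empty() || title.is_empty() {
        return None;
    }
    Some((lang, title))
}

fn main() {
    assert_eq!(
        parse_wikipedia_url("https://en.wikipedia.org/wiki/Berlin"),
        Some(("en", "Berlin"))
    );
    assert_eq!(parse_wikipedia_url("not a wikipedia url"), None);
}
```

A real implementation would also need to percent-decode the title and handle variant hosts such as `en.m.wikipedia.org`, which is where a full URL parser starts to look attractive despite its cost.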
wikiparser
Extracts articles from Wikipedia database dumps for embedding into the mwm
map files created by the Organic Maps generator.
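As a rough illustration of what the extraction step involves, here is a minimal sketch that reads a newline-delimited JSON dump from stdin and keeps only the articles whose Wikidata QID appears in a wanted set. The dump layout, the field names (`name`, `main_entity.identifier`, `article_body.html`), the hard-coded QIDs, and the use of `serde`/`serde_json` as dependencies are all assumptions for illustration, not the tool's actual input format or implementation.

```rust
use std::collections::HashSet;
use std::io::{self, BufRead};

use serde::Deserialize;

/// Assumed (hypothetical) subset of the fields an article record in a
/// newline-delimited JSON dump might carry.
#[derive(Deserialize)]
struct Page {
    name: String,
    #[serde(default)]
    main_entity: Option<MainEntity>,
    article_body: ArticleBody,
}

#[derive(Deserialize)]
struct MainEntity {
    /// Wikidata QID, e.g. "Q64" for Berlin.
    identifier: String,
}

#[derive(Deserialize)]
struct ArticleBody {
    html: String,
}

fn main() -> io::Result<()> {
    // Wikidata IDs of the articles to keep. In a real pipeline this set
    // would come from the map generator; here it is hard-coded.
    let wanted: HashSet<&str> = ["Q64", "Q90"].into_iter().collect();

    // One JSON object per line on stdin.
    let stdin = io::stdin();
    for line in stdin.lock().lines() {
        let line = line?;
        // Skip records that fail to parse rather than aborting the run.
        let Ok(page) = serde_json::from_str::<Page>(&line) else {
            continue;
        };
        let qid = page.main_entity.as_ref().map(|e| e.identifier.as_str());
        if qid.map_or(false, |q| wanted.contains(q)) {
            // A real extractor would simplify the HTML and write it out;
            // this sketch only reports what matched.
            println!(
                "{} ({}): {} bytes of HTML",
                page.name,
                qid.unwrap(),
                page.article_body.html.len()
            );
        }
    }
    Ok(())
}
```

Filtering by Wikidata QID keeps the matching logic independent of article titles and languages, which is why the sketch keys the wanted set on identifiers rather than names.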