Wikipedia parser that generates offline content embeddable into the mwm map files used by Organic Maps
Commit f12e8d802c: Add id parsing benchmarks (Evan Lloyd New-Schmidt)
Initial results:

    running 4 tests
    test hash_wikidata   ... bench:          14 ns/iter (+/- 0)
    test hash_wikipedia  ... bench:          34 ns/iter (+/- 1)
    test parse_wikidata  ... bench:          18 ns/iter (+/- 0)
    test parse_wikipedia ... bench:      60,682 ns/iter (+/- 83,376)

Based on these results and a flamegraph of loading the file, URL parsing
is the most expensive part.
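The gap between the cheap `parse_wikidata` case and the expensive `parse_wikipedia` case is plausible because a Wikidata ID can be validated with a few byte checks, while a Wikipedia article reference is a full URL that a general-purpose parser must decompose. The sketch below is illustrative only and not the crate's actual API: the function name and behavior are assumptions, showing a hand-rolled QID check of the kind that benchmarks in the low tens of nanoseconds.

```rust
// Hypothetical sketch of cheap Wikidata ID parsing; `parse_wikidata_qid`
// is an illustrative name, not a function from this repository.
fn parse_wikidata_qid(s: &str) -> Option<u64> {
    // Wikidata entity IDs for items look like "Q42": a literal 'Q'
    // followed by one or more ASCII digits.
    let rest = s.strip_prefix('Q')?;
    if rest.is_empty() || !rest.bytes().all(|b| b.is_ascii_digit()) {
        return None;
    }
    // `parse` rejects overflow, so absurdly long digit runs return None.
    rest.parse().ok()
}

fn main() {
    assert_eq!(parse_wikidata_qid("Q42"), Some(42));
    assert_eq!(parse_wikidata_qid("42"), None);
    assert_eq!(parse_wikidata_qid("Q"), None);
    println!("ok");
}
```

A Wikipedia URL, by contrast, needs scheme, host, percent-decoding, and path handling before the article title is even available, which is consistent with the flamegraph finding above.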

Signed-off-by: Evan Lloyd New-Schmidt <evan@new-schmidt.com>
2023-06-23 15:50:04 -04:00
Repository contents (with last commit):

    .github/workflows   Setup GitHub (#2)
    benches             Add id parsing benchmarks
    src                 Initial parsing and processing
    .gitignore          Initial rust setup (#1)
    Cargo.lock          Initial parsing and processing
    Cargo.toml          Initial parsing and processing
    LICENSE             Initial commit
    README.md           Initial rust setup (#1)

wikiparser

Extracts articles from Wikipedia database dumps for embedding into the mwm map files created by the Organic Maps generator.