Compare commits

...

164 Commits
4.1.x ... 4.2.x

Author SHA1 Message Date
Sarah Hoffmann
6c03099372 prepare release 4.2.4 2023-11-17 16:31:05 +01:00
Sarah Hoffmann
9f11be4c6a CI: completely remove ubuntu 18 2023-11-17 16:19:55 +01:00
Sarah Hoffmann
6d4da5123c CI: remove Ubuntu 18, no longer available on Actions 2023-11-17 16:13:36 +01:00
Sarah Hoffmann
037042f85b fix parameter use for ST_Project
Before PostGIS 3.4, ST_Project required a geography as input and
implicitly converted geometry input to geography. Since 3.4, geometry
input is supported but leads to a completely different result.
2023-11-17 14:28:52 +01:00
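What ST_Project computes can be sketched with the standard spherical destination-point formula; this is an illustrative sketch on a perfect sphere (PostGIS works on a spheroid, so its results differ slightly), and the function name and radius constant are this sketch's own:

```python
import math

EARTH_RADIUS = 6371000.0  # mean Earth radius in metres (sphere approximation)

def project(lon, lat, distance, azimuth):
    """Destination point from (lon, lat) after travelling `distance` metres
    at `azimuth` radians -- roughly what ST_Project(geography, ...) returns."""
    delta = distance / EARTH_RADIUS          # angular distance
    phi1, lam1 = math.radians(lat), math.radians(lon)
    phi2 = math.asin(math.sin(phi1) * math.cos(delta)
                     + math.cos(phi1) * math.sin(delta) * math.cos(azimuth))
    lam2 = lam1 + math.atan2(
        math.sin(azimuth) * math.sin(delta) * math.cos(phi1),
        math.cos(delta) - math.sin(phi1) * math.sin(phi2))
    return math.degrees(lam2), math.degrees(phi2)

# Project one degree of arc due east from the origin.
lon2, lat2 = project(0.0, 0.0, EARTH_RADIUS * math.pi / 180, math.pi / 2)
```

Passing a geometry instead of a geography skips this great-circle math entirely, which is why the commit pins the input type.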
Sarah Hoffmann
1da2192fb0 adapt to newest version of mypy 2023-11-17 10:17:25 +01:00
Sarah Hoffmann
35a5424332 improve code to collect the PostGIS version
The SQL contained an unchecked string literal, which may in theory be
used to attack the database.
2023-11-17 10:12:34 +01:00
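The fix replaces string interpolation with parameter binding. Nominatim's code uses psycopg against PostgreSQL; the same principle can be sketched with the stdlib sqlite3 module (table and column names here are made up for the sketch):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE meta (key TEXT, value TEXT)")
conn.execute("INSERT INTO meta VALUES ('postgis_version', '3.4.0')")

key = "postgis_version"  # imagine this value came from an untrusted source

# Unsafe: f"SELECT value FROM meta WHERE key = '{key}'" lets a crafted
# string break out of the literal. Let the driver bind the value instead:
row = conn.execute("SELECT value FROM meta WHERE key = ?", (key,)).fetchone()
version = row[0]
```

With binding, the driver treats the value as data, never as SQL, so no crafted input can alter the statement.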
Sarah Hoffmann
1187d0ab9a prepare 4.2.3 release 2023-04-11 15:35:42 +02:00
Sarah Hoffmann
ffe32af531 fix a number of corner cases with interpolation splitting
Snapping a line to a point before splitting was meant to ensure
that the split point is really on the line. However, ST_Snap() does
not always behave well for this case. It may shorten the interpolation
line in some cases, with the result that two house-number points
suddenly fall on the same spot. It might also shorten the line down
to a single point, which then makes ST_Split() crash.

Switch to a combination of ST_LineLocatePoint and ST_LineSubstring
instead, which guarantees to keep the original geometry. Explicitly
handle the corner cases, where the split point falls on the beginning
or end of the line.
2023-04-11 15:29:42 +02:00
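The ST_LineLocatePoint/ST_LineSubstring approach can be sketched in pure Python for a single segment (a deliberate simplification — the real code works on arbitrary linestrings in SQL; the function names here are this sketch's own):

```python
def line_locate_point(a, b, p):
    """Fraction in [0, 1] along segment a-b closest to p
    (cf. ST_LineLocatePoint)."""
    ax, ay = a
    bx, by = b
    dx, dy = bx - ax, by - ay
    denom = dx * dx + dy * dy
    if denom == 0:
        return 0.0
    t = ((p[0] - ax) * dx + (p[1] - ay) * dy) / denom
    return min(1.0, max(0.0, t))

def split_at(a, b, p):
    """Split segment a-b at p's projection, keeping the original geometry
    and handling the corner cases at the endpoints explicitly."""
    t = line_locate_point(a, b, p)
    if t == 0.0:
        return [], [(a, b)]   # split point at the start: nothing before it
    if t == 1.0:
        return [(a, b)], []   # split point at the end: nothing after it
    m = (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
    return [(a, m)], [(m, b)]

before, after = split_at((0, 0), (10, 0), (4, 3))
```

Unlike snapping, locating a fraction and cutting the substring never moves or shortens the original line, which is what the commit relies on.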
Sarah Hoffmann
5baa827b8a use place_to_be_deleted when force deleting objects 2023-04-11 15:29:26 +02:00
Sarah Hoffmann
3a3475acce flex style: reinstate postcode boundaries
Postcode boundaries don't have a name, so they need to be imported
unconditionally.
2023-04-11 15:28:37 +02:00
Sarah Hoffmann
b17cdb5740 call osm2pgsql postprocessing flush_deleted_places() when adding data 2023-04-11 15:28:17 +02:00
Sarah Hoffmann
069f3f5dea prepare release 4.2.2 2023-03-22 18:16:01 +01:00
Sarah Hoffmann
18f912b29f actions: restrict linting to newest version 2023-03-22 17:31:51 +01:00
Sarah Hoffmann
35e7e52501 adapt to new version of pylint 2023-03-22 16:00:53 +01:00
Sarah Hoffmann
067719481f remove more tags from full style
The full style should only save the necessary tags needed for
processing.
2023-03-22 15:18:59 +01:00
Sarah Hoffmann
8b6540c989 fix handling of unused extra tags
The tags can only be moved to extra tags after the main tags have been
handled.
2023-03-22 11:48:31 +01:00
Sarah Hoffmann
325392310f fix polygon simplification in reverse results
polygon_threshold has never really worked for reverse.
2023-03-22 11:46:41 +01:00
Sarah Hoffmann
0265d6dafc restrict place rank inheritance to address items
Place tags must have no influence on street- or POI-level
objects.
2023-03-22 11:44:02 +01:00
Sarah Hoffmann
637ef30af1 actions: use token to avoid rate limiting 2023-03-22 11:41:32 +01:00
danil
45c184d45b Main tag information added to geocodejson in reverse geocoding 2023-03-22 11:40:31 +01:00
Sarah Hoffmann
28770146f9 actions: force PHPUnit 9
PHPUnit 10 is incompatible with our tests. Not worth adapting anymore.
2023-03-22 11:39:55 +01:00
Sarah Hoffmann
a9444a06c5 docs: fix internal links
Fixes #2968.
2023-03-22 11:38:54 +01:00
Sarah Hoffmann
d756e5f0e5 fix importance recalculation
The signature of the compute_importance() function has changed.
2023-03-22 11:37:07 +01:00
Sarah Hoffmann
fabe45f60a remove comma as name separator
Commas are most of the time used as part of a name, not to
separate multiple names.

See also #2950.
2023-03-22 11:36:51 +01:00
Sarah Hoffmann
1de8bdaafe exclude names ending in :wikipedia from indexing
The wikipedia prefix is used for referencing a wikipedia article
for the given tag, not the object, so it is not useful for search.
2023-03-22 10:56:34 +01:00
Sarah Hoffmann
000a70639f fix typo in argument to details CLI command
Fixes #2951.
2023-03-22 10:56:02 +01:00
Sarah Hoffmann
6eadf6797e update Makefile in test directory 2023-03-22 10:55:35 +01:00
Sarah Hoffmann
40b061afd2 do not run osm2pgsql append with multiple threads
As the updates modify the placex table, there may be deadlocks
when different objects want to forward modifications to the same
place (for example because they are both linked to it).
2023-03-22 10:53:35 +01:00
Sarah Hoffmann
eb3a6aa509 split query that deletes old objects from placex
placex only has partial indexes over OSM types, so the OSM type
needs to be hardcoded to ensure these indexes are used.
2023-03-22 10:51:56 +01:00
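The reason for splitting the query: a partial index only applies when the planner can prove its WHERE clause holds, so one DELETE per hardcoded OSM type lets each partial index be used. The idea can be sketched with stdlib sqlite3 (which also supports partial indexes); the table layout is heavily simplified:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE placex (osm_type TEXT, osm_id INTEGER)")
# One partial index per OSM type, mirroring the real placex table.
for t in ("N", "W", "R"):
    db.execute(f"CREATE INDEX idx_placex_{t} ON placex(osm_id) "
               f"WHERE osm_type = '{t}'")
db.executemany("INSERT INTO placex VALUES (?, ?)",
               [("N", 1), ("W", 1), ("R", 1), ("N", 2)])

# Instead of one DELETE over all types, hardcode the type in each query
# so the matching partial index can be chosen by the planner.
for t in ("N", "W", "R"):
    db.execute(f"DELETE FROM placex WHERE osm_type = '{t}' AND osm_id = 1")

remaining = db.execute("SELECT osm_type, osm_id FROM placex").fetchall()
```

A single `DELETE ... WHERE osm_id = 1` without the type predicate could not use any of the partial indexes and would fall back to a scan.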
Sarah Hoffmann
9f7e6da971 minor adaptions for flex style 2023-03-22 10:50:08 +01:00
marc tobias
3729bdde7d VAGRANT.md - replace local.php settings with .env 2023-03-22 10:48:42 +01:00
Sarah Hoffmann
f8df574b78 use canonical url for nominatim.org 2023-03-22 10:46:15 +01:00
Sarah Hoffmann
51f3485874 install new lua import scripts 2023-03-22 10:45:11 +01:00
Sarah Hoffmann
a0e107d57f flex: add other default styles 2023-03-22 10:43:20 +01:00
Sarah Hoffmann
b6ae3f3f09 flex: hide compiled matchers 2023-03-22 10:42:38 +01:00
Sarah Hoffmann
4f1ddcd521 flex: switch to functions for substyles
This gives us a bit more flexibility about the implementation
in the future.
2023-03-22 10:42:09 +01:00
Sarah Hoffmann
34d629f677 explicit export for functions in flex-base 2023-03-22 10:41:51 +01:00
Sarah Hoffmann
bb613a1d85 flex: add combining clean function 2023-03-22 10:41:22 +01:00
Sarah Hoffmann
2fe0e0629a flex: simplify name handling 2023-03-22 10:41:12 +01:00
Sarah Hoffmann
a0e4e123b1 flex: simplify address configuration 2023-03-22 10:40:59 +01:00
Sarah Hoffmann
92abae7850 update osm2pgsql (flex not building index) 2023-03-22 10:40:01 +01:00
Sarah Hoffmann
6fe3dc63f5 use grapheme_stripos instead of stripos in PHP code
stripos() does not handle non-ASCII characters correctly.
2023-03-22 10:36:15 +01:00
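PHP's stripos() lowercases bytewise and therefore misses non-ASCII case pairs, while grapheme_stripos() works on Unicode graphemes. A rough Python analogue of the Unicode-aware behaviour uses str.casefold(); note that folding can change string length ('ß' becomes 'ss'), so the returned index refers to the folded haystack (the function name here is this sketch's own):

```python
def unicode_stripos(haystack, needle):
    """Case-insensitive substring search that copes with non-ASCII case
    pairs; returns an index into the case-folded haystack, or -1."""
    return haystack.casefold().find(needle.casefold())

# 'ß' case-folds to 'ss', so the German street name matches:
pos = unicode_stripos("Hauptstraße", "STRASSE")
# A naive lower()-based search, like bytewise stripos(), misses it:
naive = "Hauptstraße".lower().find("STRASSE".lower())
```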
Sarah Hoffmann
e2dcc9ebf8 do not assign postcodes to long linear features
This avoids assigning a postcode, in particular to waterway features
and long natural features like ridges and valleys.

Fixes #2915.
2023-03-22 10:35:13 +01:00
Frederik Ramm
9b233362c6 Fix typo in NOMINATIM_LOG_FILE (#2919)
* fix typo in docs (NOMINATIM_LOG_FILE uses s not ms)
2023-03-22 10:33:59 +01:00
Sarah Hoffmann
a727624b9e add FAQ about finding bad postcodes 2023-03-22 10:33:22 +01:00
Sarah Hoffmann
3313369a39 contract duplicate spaces in transliteration string
There are some pathological cases where an isolated letter may
be deleted because it is in itself meaningless. If this happens in
the middle of a sentence, then the transliteration contains two
consecutive spaces. Add a final rule to fix this.

See #2909.
2023-03-22 10:14:15 +01:00
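The effect of the added rule — collapsing the double space left behind when a meaningless isolated letter is deleted mid-sentence — corresponds to something like this Python sketch (the real fix is an ICU transliteration rule, not Python):

```python
import re

def contract_spaces(transliterated):
    # Collapse runs of two or more spaces, as created when an isolated
    # letter is deleted from the middle of a phrase, into a single space.
    return re.sub(" {2,}", " ", transliterated)

result = contract_spaces("rue  de la gare")
```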
Sarah Hoffmann
7d140970b7 prepare release 4.2.1 2023-02-20 17:58:19 +01:00
Sarah Hoffmann
cfd631e99c harmonize flags for PHP's htmlspecialchars 2023-02-20 17:54:38 +01:00
Sarah Hoffmann
3d39847e26 adapt PHP tests for debug output 2023-02-20 17:53:50 +01:00
Sarah Hoffmann
a664beb810 properly encode special HTML characters in debug mode 2023-02-20 17:53:48 +01:00
Sarah Hoffmann
04ee39467a actions: install keys for postgres repo 2022-11-24 14:04:05 +01:00
Sarah Hoffmann
1f3edf6eba prepare release 4.2.0 2022-11-24 10:43:29 +01:00
Sarah Hoffmann
a15c197547 add checklist for releases 2022-11-24 10:43:25 +01:00
Sarah Hoffmann
13dbeb75c7 Merge pull request #2903 from lonvia/migration-for-index-reorganization
Add migration for reorganization of pending indexes
2022-11-24 10:13:38 +01:00
Sarah Hoffmann
6aded60045 add migration for reorganization of pending indexes
Fixes #2900.
2022-11-24 08:48:05 +01:00
Sarah Hoffmann
8dfdf64dd5 Merge pull request #2902 from lonvia/tiger-county-sanitizer
Tiger county sanitizer
2022-11-23 17:58:42 +01:00
Sarah Hoffmann
41e8bddaa9 remove BDD test for tiger:county
We no longer rely on the import to strip the tag.
2022-11-23 10:37:27 +01:00
Sarah Hoffmann
fd3dec8efe add sanitizer for TIGER tags
Currently only takes over cleaning the tiger:county data. This was
done by the import until now.
2022-11-23 10:37:27 +01:00
Sarah Hoffmann
55ee08f42b Merge pull request #2892 from lonvia/optional-forward-dependecies
Add experimental configuration switch for enabling forward dependencies
2022-11-21 16:57:45 +01:00
Sarah Hoffmann
b6ff697ff0 add experimental option for enabling forward dependencies 2022-11-21 14:48:00 +01:00
Sarah Hoffmann
925ac1e1b9 Merge pull request #2890 from lonvia/use-rank-search-for-reverse-polygon-match
Use rank search for reverse polygon match
2022-11-20 22:11:35 +01:00
Sarah Hoffmann
77acc1c2be force use of geometry index for reverse polygon lookup 2022-11-20 20:22:44 +01:00
Sarah Hoffmann
ebe489c227 use rank_search for reverse polygon match 2022-11-20 20:22:23 +01:00
Sarah Hoffmann
9c152a030a fix condition under which place_to_be_deleted is created
It is needed for updates, independently of whether reverse-only is set.
2022-11-19 21:53:14 +01:00
Sarah Hoffmann
b310c86c55 Merge pull request #2889 from lonvia/fix-interpolation-updates
Drop illegal values for addr:interpolation on update
2022-11-18 18:51:11 +01:00
Sarah Hoffmann
c9ff7d2130 drop illegal values for addr:interpolation on update 2022-11-18 17:26:56 +01:00
Sarah Hoffmann
52456230cc Merge pull request #2887 from lonvia/lookup-linked-places
Add support for lookup of linked places
2022-11-17 13:35:53 +01:00
Sarah Hoffmann
4422533adb Merge pull request #2886 from lonvia/closest-street-in-associated
Handle associatedStreet relations with multiple streets correctly
2022-11-17 07:29:25 +01:00
Sarah Hoffmann
c4b13f2b7f add support for lookup of linked places 2022-11-16 21:34:45 +01:00
Sarah Hoffmann
4f05a03d13 handle associatedStreet relations with multiple streets
When an associatedStreet relation has multiple street members,
always take the closest one. Avoid geometry operations for
the frequent case that there is only one street.
2022-11-16 17:25:51 +01:00
Sarah Hoffmann
7a2e586cce Merge pull request #2884 from lonvia/tweak-special-term-penalties
Correctly handle special term + name combination
2022-11-15 19:29:55 +01:00
Sarah Hoffmann
98ce424650 Merge pull request #2885 from lonvia/remove-unused-countries
Remove dependent territories from country list
2022-11-15 19:29:39 +01:00
Sarah Hoffmann
3059a3da4e correctly handle special term + name combination
Special terms with an operator name usually appear in combination with
the name. The current penalties only took name + special term into
account, not special term + name.

Fixes #2876.
2022-11-15 11:55:40 +01:00
Sarah Hoffmann
d63d7cb9a8 remove dependent territories from country list
Removes territories of US, France, Australia and Netherlands from the
country list. These territories have their own country code (which is
why they are in the list in the first place) but are mapped as part of
the admin_level 2 relations for the respective parent countries.
Therefore they never had any places attached. In practical terms, the
change only affects the number of tables created.
2022-11-15 11:37:30 +01:00
Sarah Hoffmann
f3f542e864 Merge pull request #2881 from lonvia/more-update-tests-for-osm2pgsql
Experimental support for osm2pgsql flex output
2022-11-15 09:39:46 +01:00
Sarah Hoffmann
93ada250f7 bdd: add tests for osm2pgsql update of postcode nodes 2022-11-14 17:27:04 +01:00
Sarah Hoffmann
d8e3ba3b54 bdd: add osm2pgsql tests for updating interpolations 2022-11-14 16:57:31 +01:00
Sarah Hoffmann
a46348da38 bdd: test placex content when updating with osm2pgsql 2022-11-14 14:48:44 +01:00
Sarah Hoffmann
36cf0eb922 reorganize handling of place type changes
Always replace existing entries in place, never delete them, because
a direct delete will cause conflicts.
2022-11-14 13:57:26 +01:00
Sarah Hoffmann
63a9bc94f7 fix country handling in flex style
If the country tag does not match a 2-letter code, it needs to
be dropped.
2022-11-10 15:52:13 +01:00
Sarah Hoffmann
2dafc4cf4f remove tests that differ between lua and gazetteer versions 2022-11-10 15:51:55 +01:00
Sarah Hoffmann
68d09f9cad node locations must be stable for osm2pgsql update tests 2022-11-10 11:11:45 +01:00
Sarah Hoffmann
b98d3d3f00 bdd: extend osm2pgsql update tests
Now also checks for correct indexing state of placex table.
2022-11-10 09:38:25 +01:00
Sarah Hoffmann
3683cf7ddc optimise tag match function 2022-11-10 09:38:25 +01:00
Sarah Hoffmann
84e5e601e1 add lua requirements for vagrant scripts 2022-11-10 09:38:25 +01:00
Sarah Hoffmann
a1da149211 CI: require lua libraries 2022-11-10 09:38:25 +01:00
Sarah Hoffmann
74405e9684 add migration for place_to_be_deleted table 2022-11-10 09:38:25 +01:00
Sarah Hoffmann
2fac507453 change updates to handle delete/insert workflow
This makes Nominatim compatible with osm2pgsql's default update
modus operandi of deleting and reinserting data. Deletes are diverted
into a TODO table instead of executing them. When data is reinserted,
the corresponding entry in the TODO table is deleted. After updates are
finished, the remaining entries in the TODO table are executed, doing
the same work as the delete trigger did before.

The new behaviour also works against the gazetteer output with its
insert-only mechanism.
2022-11-10 09:38:23 +01:00
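The delete/insert workflow described above — divert deletes into a TODO table, cancel them on reinsert, then flush what remains — can be sketched with stdlib sqlite3 (a toy schema standing in for the real place tables and triggers):

```python
import sqlite3

upd = sqlite3.connect(":memory:")
upd.executescript("""
    CREATE TABLE place (osm_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE place_to_be_deleted (osm_id INTEGER PRIMARY KEY);
    INSERT INTO place VALUES (1, 'old name'), (2, 'really gone');
""")

def delete_place(osm_id):
    # Divert the delete into the TODO table instead of executing it.
    upd.execute("INSERT OR IGNORE INTO place_to_be_deleted VALUES (?)",
                (osm_id,))

def insert_place(osm_id, name):
    # A reinsert cancels the pending delete for the same object.
    upd.execute("DELETE FROM place_to_be_deleted WHERE osm_id = ?", (osm_id,))
    upd.execute("INSERT OR REPLACE INTO place VALUES (?, ?)", (osm_id, name))

def flush_deleted_places():
    # After the update run, carry out the deletes that remain pending.
    upd.execute("DELETE FROM place WHERE osm_id IN "
                "(SELECT osm_id FROM place_to_be_deleted)")
    upd.execute("DELETE FROM place_to_be_deleted")

# osm2pgsql-style update: both objects deleted, object 1 reinserted.
delete_place(1)
delete_place(2)
insert_place(1, 'new name')
flush_deleted_places()
rows = upd.execute("SELECT osm_id, name FROM place").fetchall()
```

Because nothing is actually deleted until the flush, the scheme works equally for delete/reinsert updates and for insert-only streams like the gazetteer output.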
Sarah Hoffmann
51ed55cc32 initial flex import scripts
Only implements the extratags style for the moment. Tests pass
for the same behaviour as the gazetteer output. Updates still need
to be done.
2022-11-10 09:37:38 +01:00
Sarah Hoffmann
de2a3bd5f8 bdd tests: make import style configurable
The switch is for development. Tests are not guaranteed to still
work when run with anything but the 'extratags' style.
2022-11-10 09:37:38 +01:00
Sarah Hoffmann
981e9700be add osm2pgsql gazetteer tests
This ports the gazetteer tests from osm2pgsql to BDD tests.
2022-11-10 09:37:38 +01:00
Sarah Hoffmann
b52ce4f9f2 Merge pull request #2869 from mtmail/improve-tiger-install-doc
Tiger install doc: add -refresh website- step
2022-11-09 20:48:39 +01:00
Sarah Hoffmann
64c591da7f fix type issues with calls to pyosmium 2022-11-09 20:46:33 +01:00
Marc Tobias
2387648a85 Tiger install doc: add -refresh website- step 2022-11-09 17:33:31 +01:00
Sarah Hoffmann
846ecff0c5 Merge pull request #2871 from lonvia/fix-timeout-for-updates
Fix timeout for updates
2022-11-09 14:26:39 +01:00
Sarah Hoffmann
26a5b59c28 add types-requests dependency 2022-11-09 09:12:37 +01:00
Sarah Hoffmann
6ddb39fda3 respect socket timeout also in other replication functions 2022-11-09 09:12:37 +01:00
Sarah Hoffmann
1fdcec985a fix timeout use for replication timeout
The timeout parameter is no longer taken into account since
pyosmium switched to the requests library. This adds the parameter
back.
2022-11-09 09:12:37 +01:00
Sarah Hoffmann
30f526c943 Merge pull request #2870 from mtmail/update-github-actions-to-node-16
update those github action packages still using node12
2022-11-08 17:24:53 +01:00
Marc Tobias
253127cb9f update those github action packages still using node12 2022-11-08 15:16:55 +01:00
Sarah Hoffmann
3237ca587f Merge pull request #2866 from lonvia/reverse-ignore-interpolations-without-parent
Ignore interpolations without parent on reverse search
2022-11-07 09:00:59 +01:00
Sarah Hoffmann
0dbc0ae6d5 ignore interpolations without parent on reverse search
If no parent can be found for an interpolation, there is most
likely a data error involved, so don't show these interpolations
in reverse search results.
2022-11-05 22:16:09 +01:00
Sarah Hoffmann
7461ff4680 Merge pull request #2865 from Romeo-PHILLIPS/fix/documentation_status_code
Fix: documentation status code
2022-11-05 22:14:44 +01:00
Romeo
afc714e1d3 fix: format 2022-11-04 18:05:40 +01:00
Romeo
3bc0db8d91 fix: markup 2022-11-04 18:04:28 +01:00
Romeo
d573da5b2c fix: 705 Status Code Documentation 2022-11-04 18:03:49 +01:00
Romeo
ecd5a3fdf9 fix: 705 Status Code Documentation 2022-11-04 17:59:36 +01:00
Sarah Hoffmann
543d63e7a9 Merge pull request #2862 from mtmail/remove-version-from-fpm-sock-file
Install scripts: remove version from /var/run/php-fpm filenames
2022-11-04 17:32:50 +01:00
Sarah Hoffmann
7a22ae6bf9 Merge pull request #2863 from lonvia/add-support-for-postgresql-15
Update CI tests to postgresql 15
2022-11-04 17:32:06 +01:00
Sarah Hoffmann
ebe23d6882 update CI tests to postgresql 15 2022-11-04 16:21:15 +01:00
marc tobias
33c805aee0 Install scripts: remove version from /var/run/php-fpm filenames 2022-11-04 14:22:11 +01:00
Sarah Hoffmann
616ff4ae25 actions: pin pyicu to 2.9 2022-10-24 14:21:44 +02:00
Sarah Hoffmann
e221eaa977 Merge pull request #2836 from mtmail/tiger2022
Documentation: remove year from TIGER filename, new 2022 data
2022-10-24 11:21:55 +02:00
Sarah Hoffmann
eed7abb839 Merge pull request #2838 from lonvia/update-osm2pgsql
Update osm2pgsql to latest 1.7.1 release
2022-10-05 18:59:13 +02:00
Sarah Hoffmann
5f6dcd36ed fix flaky API test
The search 'landstr' produces many duplicates, so that with
some bad luck 4 or fewer results may appear. Disable deduplication
to make it more predictable.
2022-10-05 15:16:14 +02:00
Sarah Hoffmann
f395054536 update osm2pgsql to 1.7.1 2022-10-04 21:16:57 +02:00
Sarah Hoffmann
afeafc8aa7 Merge pull request #2835 from lonvia/secondary-importance
Secondary importance
2022-10-04 16:25:47 +02:00
marc tobias
f1ece658f8 Documentation: remove year from TIGER filename 2022-10-04 14:19:36 +02:00
Sarah Hoffmann
b3abb355eb docs: add customization hints for secondary importance
Removing the download links for now as the tile importance
is still too experimental.
2022-10-01 11:01:49 +02:00
Sarah Hoffmann
5877b69d51 do not run unit test when postgis_raster is not available 2022-10-01 11:01:49 +02:00
Sarah Hoffmann
5ec2c1b712 adapt unit tests to changed function names 2022-10-01 11:01:49 +02:00
Sarah Hoffmann
0a73ed7d64 add secondary importance to API BDD tests
Also fixes a path issue during API test DB creation that could
never have worked.
2022-10-01 11:01:49 +02:00
Sarah Hoffmann
abf349fb0d simplify use of secondary importance
The values in the raster are already normalized between 0 and 2**16,
so a simple conversion to [0, 1] will do.

Check for the existence of the secondary_importance table statically
when creating the SQL function. For that to work, importance tables
need to be created before the functions.
2022-10-01 11:01:49 +02:00
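Under the stated assumption that the raster samples are already normalized to the range 0 to 2**16, the conversion to [0, 1] reduces to a single division, sketched here (the function name is this sketch's own):

```python
def secondary_importance(raster_value):
    """Map a raw raster sample in [0, 65536] to an importance in [0, 1]."""
    return raster_value / 65536.0
```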
Sarah Hoffmann
3185fad918 load views as a SQL file and rename to 'secondary importance'
The only requirement for secondary importance is that a raster table
comes out of it. The generic name leaves open where the data comes
from.
2022-10-01 11:01:49 +02:00
Tareq Al-Ahdal
0ab0f0ea44 Integrated OSM views into importance computation 2022-10-01 11:01:49 +02:00
Tareq Al-Ahdal
ac467c7a2d Enhanced the implementation of OSM views GeoTIFF import functionality 2022-10-01 11:01:49 +02:00
Tareq Al-Ahdal
c85b74497b Initial implementation of GeoTIFF import functionality 2022-10-01 11:01:49 +02:00
Sarah Hoffmann
3381a92d92 Merge pull request #2832 from lonvia/conditional-analyze-on-indexing
Only run analyze on indexing when work was done
2022-09-28 15:17:40 +02:00
Sarah Hoffmann
a2ee58d8a1 only run analyze on indexing when work was done
This speeds up processing when continuing indexing after it was
interrupted.
2022-09-28 10:22:54 +02:00
Sarah Hoffmann
051f3720ce Merge pull request #2829 from lonvia/optimize-indexes
Further optimize indexes
2022-09-26 10:02:51 +02:00
Sarah Hoffmann
f017e1e9a1 make sure indexes are used 2022-09-25 14:09:45 +02:00
Sarah Hoffmann
33ba6896a8 further split up the big geometry index
Adds partial indexes for all geometry queries used during import.
A full index is not necessary anymore at that point. Still create
the index afterwards for use in queries.

Also documents for each index where it is used.
2022-09-21 16:21:41 +02:00
Sarah Hoffmann
f4d3ae6f70 consolidate indexes over geometry_sectors
The indexes over geometry_sectors are mainly used for ordering
the places which need indexing. That means they effectively function
as a TODO list. Consolidate them so that they always contain only
the places which are still to do. Also add the appropriate index
for the boundary indexing phase.
2022-09-21 10:38:58 +02:00
Sarah Hoffmann
860f3559a1 split up large osmid index on placex
This doesn't do anything in terms of lookup speeds, but the resulting
indexes are quite a bit smaller.
2022-09-21 09:24:57 +02:00
Sarah Hoffmann
d8be8a7293 fix funding link 2022-09-19 15:39:58 +02:00
Sarah Hoffmann
9750a361c9 add Github Sponsoring to funding page 2022-09-19 15:38:56 +02:00
Sarah Hoffmann
ed3dd81d04 run final index creation in parallel 2022-09-19 11:55:25 +02:00
Sarah Hoffmann
bef1aebf1c add function for parallel execution of SQL scripts 2022-09-19 11:52:17 +02:00
Sarah Hoffmann
26688ba35d add link to funding page 2022-09-19 10:30:58 +02:00
Sarah Hoffmann
a1158feeb8 Merge pull request #2818 from lonvia/better-geometry-index
Add index for lookup of addressable areas
2022-09-19 10:18:43 +02:00
Sarah Hoffmann
aef014a47d add indexes for lookup of addressable areas
The generic geometry index has become too slow for that purpose.
2022-09-18 16:57:12 +02:00
Sarah Hoffmann
d6a0947e5a update security policy for 4.1 version 2022-09-13 08:58:31 +02:00
Sarah Hoffmann
bc94318d83 mypy: fix new warnings due to external type updates 2022-09-05 17:39:35 +02:00
Sarah Hoffmann
d4c6e58b57 Merge pull request #2812 from mausch/patch-1
docs: fix links to rank docs
2022-09-05 17:27:09 +02:00
Mauricio Scheffer
66832cf0a5 docs: fix links to rank docs 2022-09-05 11:11:13 +01:00
Sarah Hoffmann
bcfe817212 Merge pull request #2799 from lonvia/fix-inclusions-with-extratags
Ignore irrelevant extra tags on address interpolations
2022-08-13 19:02:27 +02:00
Sarah Hoffmann
07d72f950b Merge pull request #2739 from tareqpi/collect_os_info.sh
integration of host system information script into Nominatim CLI tool
2022-08-13 19:02:14 +02:00
Sarah Hoffmann
dddfa3a075 ignore irrelevant extra tags on address interpolations
When deciding if an address interpolation has address information, only
look for addr:street and addr:place. If they are not there, go looking
for the address on the address nodes. Ignore irrelevant tags like
addr:inclusion.

Fixes #2797.
2022-08-13 14:07:06 +02:00
Tareq Al-Ahdal
74019877a4 Added the feature of collecting host system information to the CI tests 2022-08-13 06:22:13 +08:00
Tareq Al-Ahdal
465d82a92f Integrated 'collect_os_info.py' into Nominatim's CLI tool 2022-08-13 06:18:10 +08:00
Tareq Al-Ahdal
49f889bf09 Enhanced and refactored 'collect_os_info.py'
Changed the script to a functional programming style, removing the large number of local attributes to decrease memory usage. Additional OS info is now included.
2022-08-13 06:13:05 +08:00
Tareq Al-Ahdal
5e477e3b5b Merge remote-tracking branch 'upstream/master' into collect_os_info.sh 2022-08-13 05:53:39 +08:00
Sarah Hoffmann
67cfad6a2c Merge pull request #2798 from lonvia/more-rank-change-fixes
Invalidations when boundaries and places change their rank
2022-08-12 11:42:03 +02:00
Sarah Hoffmann
487e81fe3c more invalidations when boundary changes rank
When a boundary or place changes its address rank, all places where
it participates as address need to be potentially reindexed.
Also use the computed rank when testing place nodes against
boundaries. Boundaries are computed earlier.

Fixes #2794.
2022-08-12 09:48:46 +02:00
Sarah Hoffmann
18f525ac54 Merge pull request #2793 from lonvia/increase-minimum-results
Fix minimum number of results that are searched for
2022-08-09 20:08:45 +02:00
Sarah Hoffmann
e0c184e097 fix base number of returned results
The intent was to always search for at least 10 results.

Improves on #882.
2022-08-09 13:53:20 +02:00
Sarah Hoffmann
78716ab8b9 Merge pull request #2792 from lonvia/new-type-annotations
Adapt to new type annotations from typeshed
2022-08-09 13:52:20 +02:00
Sarah Hoffmann
8d082c13e0 adapt to new type annotations from typeshed
Some more functions from psycopg are now properly annotated.
No type-ignore comments are necessary anymore.
2022-08-09 11:06:54 +02:00
Sarah Hoffmann
196dc2a659 docs: add types-psutil requirement 2022-08-08 09:46:25 +02:00
Sarah Hoffmann
4fe797d704 remove mypy ignore for psutil.virtual_memory()
Now available in typeshed.
2022-08-08 09:44:45 +02:00
Sarah Hoffmann
3c188164ab Merge pull request #2789 from lonvia/update-osm2pgsql
Update osm2pgsql (fixes admin_level parsing)
2022-08-08 09:15:58 +02:00
Sarah Hoffmann
5330370076 update osm2pgsql (fix admin_level parsing) 2022-08-07 18:34:47 +02:00
Micah David Cochran
8bda59fbe7 made collect_os_info script in Python 2022-01-03 14:57:01 -06:00
Micah David Cochran
f20d85738f add utils/collect_os_info.sh script 2021-12-13 11:26:09 -06:00
137 changed files with 3638 additions and 1359 deletions

.github/FUNDING.yml (new file)

@@ -0,0 +1,2 @@
+github: lonvia
+custom: "https://nominatim.org/funding/"


@@ -9,6 +9,10 @@ inputs:
     description: 'Additional options to hand to cmake'
     required: false
     default: ''
+  lua:
+    description: 'Version of Lua to use'
+    required: false
+    default: '5.3'
 runs:
   using: "composite"
@@ -21,9 +25,9 @@ runs:
       shell: bash
     - name: Install prerequisites
       run: |
-        sudo apt-get install -y -qq libboost-system-dev libboost-filesystem-dev libexpat1-dev zlib1g-dev libbz2-dev libpq-dev libproj-dev libicu-dev
+        sudo apt-get install -y -qq libboost-system-dev libboost-filesystem-dev libexpat1-dev zlib1g-dev libbz2-dev libpq-dev libproj-dev libicu-dev liblua${LUA_VERSION}-dev lua${LUA_VERSION}
         if [ "x$UBUNTUVER" == "x18" ]; then
-            pip3 install python-dotenv psycopg2==2.7.7 jinja2==2.8 psutil==5.4.2 pyicu osmium PyYAML==5.1 datrie
+            pip3 install python-dotenv psycopg2==2.7.7 jinja2==2.8 psutil==5.4.2 pyicu==2.9 osmium PyYAML==5.1 datrie
         else
             sudo apt-get install -y -qq python3-icu python3-datrie python3-pyosmium python3-jinja2 python3-psutil python3-psycopg2 python3-dotenv python3-yaml
         fi
@@ -31,6 +35,7 @@ runs:
       env:
         UBUNTUVER: ${{ inputs.ubuntu }}
         CMAKE_ARGS: ${{ inputs.cmake-args }}
+        LUA_VERSION: ${{ inputs.lua }}
     - name: Configure
       run: mkdir build && cd build && cmake $CMAKE_ARGS ../Nominatim


@@ -15,7 +15,9 @@ runs:
     - name: Remove existing PostgreSQL
       run: |
         sudo apt-get purge -yq postgresql*
-        sudo sh -c 'echo "deb http://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
+        sudo apt install curl ca-certificates gnupg
+        curl https://www.postgresql.org/media/keys/ACCC4CF8.asc | gpg --dearmor | sudo tee /etc/apt/trusted.gpg.d/apt.postgresql.org.gpg >/dev/null
+        sudo sh -c 'echo "deb https://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
         sudo apt-get update -qq
       shell: bash


@@ -7,11 +7,11 @@ jobs:
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v2
+      - uses: actions/checkout@v3
        with:
          submodules: true
-      - uses: actions/cache@v2
+      - uses: actions/cache@v3
        with:
          path: |
            data/country_osm_grid.sql.gz
@@ -27,7 +27,7 @@ jobs:
          mv nominatim-src.tar.bz2 Nominatim
      - name: 'Upload Artifact'
-       uses: actions/upload-artifact@v2
+       uses: actions/upload-artifact@v3
        with:
          name: full-source
          path: nominatim-src.tar.bz2
@@ -37,20 +37,15 @@ jobs:
     needs: create-archive
     strategy:
       matrix:
-        ubuntu: [18, 20, 22]
+        ubuntu: [20, 22]
         include:
-          - ubuntu: 18
-            postgresql: 9.6
-            postgis: 2.5
-            pytest: pytest
-            php: 7.2
           - ubuntu: 20
             postgresql: 13
             postgis: 3
             pytest: py.test-3
             php: 7.4
           - ubuntu: 22
-            postgresql: 14
+            postgresql: 15
             postgis: 3
             pytest: py.test-3
             php: 8.1
@@ -58,7 +53,7 @@ jobs:
     runs-on: ubuntu-${{ matrix.ubuntu }}.04
     steps:
-      - uses: actions/download-artifact@v2
+      - uses: actions/download-artifact@v3
        with:
          name: full-source
@@ -69,10 +64,12 @@ jobs:
       uses: shivammathur/setup-php@v2
       with:
         php-version: ${{ matrix.php }}
-        tools: phpunit, phpcs, composer
+        tools: phpunit:9, phpcs, composer
         ini-values: opcache.jit=disable
+      env:
+        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-    - uses: actions/setup-python@v2
+    - uses: actions/setup-python@v4
       with:
         python-version: 3.6
       if: matrix.ubuntu == 18
@@ -99,19 +96,23 @@ jobs:
       if: matrix.ubuntu == 22
     - name: Install latest pylint/mypy
-      run: pip3 install -U pylint mypy types-PyYAML types-jinja2 types-psycopg2 types-psutil typing-extensions
+      run: pip3 install -U pylint mypy types-PyYAML types-jinja2 types-psycopg2 types-psutil types-requests typing-extensions
       if: matrix.ubuntu == 22
     - name: PHP linting
       run: phpcs --report-width=120 .
+      working-directory: Nominatim
       if: matrix.ubuntu == 22
     - name: Python linting
       run: pylint nominatim
+      working-directory: Nominatim
       if: matrix.ubuntu == 22
     - name: Python static typechecking
       run: mypy --strict nominatim
+      working-directory: Nominatim
       if: matrix.ubuntu == 22
     - name: PHP unit tests
@@ -136,7 +137,7 @@ jobs:
     runs-on: ubuntu-20.04
     steps:
-      - uses: actions/download-artifact@v2
+      - uses: actions/download-artifact@v3
        with:
          name: full-source
@@ -231,7 +232,7 @@ jobs:
         OS: ${{ matrix.name }}
         INSTALL_MODE: ${{ matrix.install_mode }}
-    - uses: actions/download-artifact@v2
+    - uses: actions/download-artifact@v3
       with:
         name: full-source
         path: /home/nominatim
@@ -265,6 +266,10 @@ jobs:
       run: nominatim --version
       working-directory: /home/nominatim/nominatim-project
+    - name: Collect host OS information
+      run: nominatim admin --collect-os-info
+      working-directory: /home/nominatim/nominatim-project
     - name: Import
       run: nominatim import --osm-file ../test.pbf
       working-directory: /home/nominatim/nominatim-project


@@ -13,6 +13,6 @@ ignored-classes=NominatimArgs,closing
 # 'too-many-ancestors' is triggered already by deriving from UserDict
 # 'not-context-manager' disabled because it causes false positives once
 # typed Python is enabled. See also https://github.com/PyCQA/pylint/issues/5273
-disable=too-few-public-methods,duplicate-code,too-many-ancestors,bad-option-value,no-self-use,not-context-manager
+disable=too-few-public-methods,duplicate-code,too-many-ancestors,bad-option-value,no-self-use,not-context-manager,use-dict-literal
-good-names=i,x,y,fd,db,cc
+good-names=i,x,y,m,fd,db,cc


@@ -19,8 +19,8 @@ list(APPEND CMAKE_MODULE_PATH "${CMAKE_SOURCE_DIR}/cmake")
 project(nominatim)
 set(NOMINATIM_VERSION_MAJOR 4)
-set(NOMINATIM_VERSION_MINOR 1)
-set(NOMINATIM_VERSION_PATCH 0)
+set(NOMINATIM_VERSION_MINOR 2)
+set(NOMINATIM_VERSION_PATCH 4)
 set(NOMINATIM_VERSION "${NOMINATIM_VERSION_MAJOR}.${NOMINATIM_VERSION_MINOR}.${NOMINATIM_VERSION_PATCH}")
@@ -63,7 +63,6 @@ if (BUILD_IMPORTER AND BUILD_OSM2PGSQL)
   endif()
   set(BUILD_TESTS_SAVED "${BUILD_TESTS}")
   set(BUILD_TESTS off)
-  set(WITH_LUA off CACHE BOOL "")
   add_subdirectory(osm2pgsql)
   set(BUILD_TESTS ${BUILD_TESTS_SAVED})
 endif()
@@ -270,6 +269,12 @@ install(FILES settings/env.defaults
               settings/import-address.style
               settings/import-full.style
               settings/import-extratags.style
+              settings/import-admin.lua
+              settings/import-street.lua
+              settings/import-address.lua
+              settings/import-full.lua
+              settings/import-extratags.lua
+              settings/flex-base.lua
               settings/icu_tokenizer.yaml
               settings/country_settings.yaml
        DESTINATION ${NOMINATIM_CONFIGDIR})


@@ -64,3 +64,39 @@ Before submitting a pull request make sure that the tests pass:
cd build
make test
```
## Releases
Nominatim follows semantic versioning. Major releases are done for large changes
that require (or at least strongly recommend) a reimport of the databases.
Minor releases can usually be applied to existing databases. Patch releases
contain bug fixes only and are released from a separate branch where the
relevant changes are cherry-picked from the master branch.
Checklist for releases:
* [ ] increase version in `nominatim/version.py` and CMakeLists.txt
* [ ] update `ChangeLog` (copy information from patch releases from release branch)
* [ ] complete `docs/admin/Migration.md`
* [ ] update EOL dates in `SECURITY.md`
* [ ] commit and make sure CI tests pass
* [ ] test migration
* download, build and import previous version
* migrate using master version
* run updates using master version
* [ ] prepare tarball:
* `git clone --recursive https://github.com/osm-search/Nominatim` (switch to right branch!)
* `rm -r .git* osm2pgsql/.git*`
* copy country data into `data/`
* add version to base directory and package
* [ ] upload tarball to https://nominatim.org
* [ ] prepare documentation
* check out new docs branch
* change git checkout instructions to tarball download instructions or adapt version on existing ones
* build documentation and copy to https://github.com/osm-search/nominatim-org-site
* add new version to history
* [ ] check release tarball
* download tarball as per new documentation instructions
* compile and import Nominatim
* run `nominatim --version` to confirm correct version
* [ ] tag new release and add a release on github.com


@@ -1,3 +1,60 @@
4.2.4
* fix a potential SQL injection in 'nominatim admin --collect-os-info'
* fix compatibility issue with PostGIS 3.4
4.2.3
* fix deletion handling for 'nominatim add-data'
* adapt place_force_delete() to new deletion handling
* flex style: avoid dropping of postcode areas
* fix update errors on address interpolation handling
4.2.2
* extend flex-style library to fully support all default styles
* fix handling of Hebrew aleph
* do not assign postcodes to rivers
* fix string matching in PHP code
* update osm2pgsql (various updates to flex)
* fix slow query when deleting places on update
* fix CLI details query
* fix recalculation of importance values
* fix polygon simplification in reverse results
* add class/type information to reverse geocodejson result
* minor improvements to default tokenizer configuration
* various smaller fixes to documentation
4.2.1
* fix XSS vulnerability in debug view
4.2.0
* add experimental support for osm2pgsql flex style
* introduce secondary importance value to be retrieved from a raster data file
(currently still unused, to replace address importance, thanks to @tareqpi)
* add new report tool `nominatim admin --collect-os-info`
(thanks @micahcochran, @tareqpi)
* reorganise index to improve lookup performance and size
* run index creation after import in parallel
* run ANALYZE more selectively to speed up continuation of indexing
* fix crash on update when addr:interpolation receives an illegal value
* fix minimum number of retrieved results to be at least 10
* fix search for combinations of special term + name (e.g Hotel Bellevue)
* do not return interpolations without a parent street on reverse search
* improve invalidation of linked places on updates
* fix address parsing for interpolation lines
* make sure socket timeouts are respected during replication
(working around a bug in some versions of pyosmium)
* update bundled osm2pgsql to 1.7.1
* add support for PostgreSQL 15
* typing fixes to work with latest type annotations from typeshed
* smaller improvements to documentation (thanks to @mausch)
4.1.1
* fix XSS vulnerability in debug view
4.1.0
* switch to ICU tokenizer as default
@@ -34,6 +91,10 @@
* add setup instructions for updates and systemd
* drop support for PostgreSQL 9.5
4.0.2
* fix XSS vulnerability in debug view
4.0.1
* fix initialisation error in replication script
@@ -72,6 +133,10 @@
* add testing of installation scripts via CI
* drop support for Python < 3.6 and Postgresql < 9.5
3.7.3
* fix XSS vulnerability in debug view
3.7.2
* fix database check for reverse-only imports


@@ -9,10 +9,11 @@ versions.
| Version | End of support for security updates |
| ------- | ----------------------------------- |
| 4.2.x | 2024-11-24 |
| 4.1.x | 2024-08-05 |
| 4.0.x | 2023-11-02 |
| 3.7.x | 2023-04-05 |
| 3.6.x | 2022-12-12 |
| 3.5.x | 2022-06-05 |
## Reporting a Vulnerability


@@ -1,6 +1,6 @@
# Install Nominatim in a virtual machine for development and testing
This document describes how you can install Nominatim inside a Ubuntu 16
This document describes how you can install Nominatim inside a Ubuntu 22
virtual machine on your desktop/laptop (host machine). The goal is to give
you a development environment to easily edit code and run the test suite
without affecting the rest of your system.
@@ -69,8 +69,7 @@ installation.
PHP errors are written to `/var/log/apache2/error.log`.
With `echo` and `var_dump()` you write into the output (HTML/XML/JSON) when
you either add `&debug=1` to the URL (preferred) or set
`@define('CONST_Debug', true);` in `settings/local.php`.
you add `&debug=1` to the URL.
In the Python BDD test you can use `logger.info()` for temporary debug
statements.
@@ -130,6 +129,10 @@ and then
Yes, Vagrant and Virtualbox can be installed on MS Windows just fine. You need a 64bit
version of Windows.
##### Will it run on Apple Silicon?
You might need to replace Virtualbox with [Parallels](https://www.parallels.com/products/desktop/).
There is no free/open source version of Parallels.
##### Why Monaco, can I use another country?
@@ -141,11 +144,12 @@ No. Long running Nominatim installations will differ once new import features (o
bug fixes) get added since those usually only get applied to new/changed data.
Also this document skips the optional Wikipedia data import which affects ranking
of search results. See [Nominatim installation](https://nominatim.org/release-docs/latest/admin/Installation) for details.
of search results. See [Nominatim installation](https://nominatim.org/release-docs/latest/admin/Installation)
for details.
##### Why Ubuntu? Can I test CentOS/Fedora/CoreOS/FreeBSD?
There is a Vagrant script for CentOS available, but the Nominatim directory
There used to be a Vagrant script for CentOS available, but the Nominatim directory
isn't symlinked/mounted to the host which makes development trickier. We used
it mainly for debugging installation with SELinux.
@@ -154,14 +158,17 @@ are slightly different, e.g. the name of the package manager, Apache2 package
name, location of files. We chose Ubuntu because that is closest to the
nominatim.openstreetmap.org production environment.
You can configure/download other Vagrant boxes from [https://app.vagrantup.com/boxes/search](https://app.vagrantup.com/boxes/search).
You can configure/download other Vagrant boxes from
[https://app.vagrantup.com/boxes/search](https://app.vagrantup.com/boxes/search).
##### How can I connect to an existing database?
Let's say you have a Postgres database named `nominatim_it` on server `your-server.com` and port `5432`. The Postgres username is `postgres`. You can edit `settings/local.php` and point Nominatim to it.
Let's say you have a Postgres database named `nominatim_it` on server `your-server.com`
and port `5432`. The Postgres username is `postgres`. You can edit the `.env` in your
project directory and point Nominatim to it.
NOMINATIM_DATABASE_DSN="pgsql:host=your-server.com;port=5432;user=postgres;dbname=nominatim_it
pgsql:host=your-server.com;port=5432;user=postgres;dbname=nominatim_it
No data import or restarting necessary.
If the Postgres installation is behind a firewall, you can try
@@ -169,11 +176,12 @@ If the Postgres installation is behind a firewall, you can try
ssh -L 9999:localhost:5432 your-username@your-server.com
inside the virtual machine. It will map the port to `localhost:9999` and then
you edit `settings/local.php` with
you edit `.env` file with
@define('CONST_Database_DSN', 'pgsql:host=localhost;port=9999;user=postgres;dbname=nominatim_it');
NOMINATIM_DATABASE_DSN="pgsql:host=localhost;port=9999;user=postgres;dbname=nominatim_it"
To access postgres directly remember to specify the hostname, e.g. `psql --host localhost --port 9999 nominatim_it`
To access postgres directly remember to specify the hostname,
e.g. `psql --host localhost --port 9999 nominatim_it`
##### My computer is slow and the import takes too long. Can I start the virtual machine "in the cloud"?


@@ -99,7 +99,7 @@ Unix socket instead, change the pool configuration
``` ini
; Replace the tcp listener and add the unix socket
listen = /var/run/php-fpm.sock
listen = /var/run/php-fpm-nominatim.sock
; Ensure that the daemon runs as the correct user
listen.owner = www-data
@@ -121,7 +121,7 @@ location @php {
fastcgi_param SCRIPT_FILENAME "$document_root$uri.php";
fastcgi_param PATH_TRANSLATED "$document_root$uri.php";
fastcgi_param QUERY_STRING $args;
fastcgi_pass unix:/var/run/php-fpm.sock;
fastcgi_pass unix:/var/run/php-fpm-nominatim.sock;
fastcgi_index index.php;
include fastcgi_params;
}
@@ -131,7 +131,7 @@ location ~ [^/]\.php(/|$) {
if (!-f $document_root$fastcgi_script_name) {
return 404;
}
fastcgi_pass unix:/var/run/php-fpm.sock;
fastcgi_pass unix:/var/run/php-fpm-nominatim.sock;
fastcgi_index search.php;
include fastcgi.conf;
}


@@ -74,15 +74,15 @@ but it will improve the quality of the results if this is installed.
This data is available as a binary download. Put it into your project directory:
cd $PROJECT_DIR
wget https://www.nominatim.org/data/wikimedia-importance.sql.gz
wget https://nominatim.org/data/wikimedia-importance.sql.gz
The file is about 400MB and adds around 4GB to the Nominatim database.
!!! tip
If you forgot to download the wikipedia rankings, you can also add
importances after the import. Download the files, then run
`nominatim refresh --wiki-data --importance`. Updating importances for
a planet can take a couple of hours.
If you forgot to download the wikipedia rankings, then you can
also add importances after the import. Download the SQL files, then
run `nominatim refresh --wiki-data --importance`. Updating
importances for a planet will take a couple of hours.
### External postcodes
@@ -92,8 +92,8 @@ and the UK (using the [CodePoint OpenData set](https://osdatahub.os.uk/downloads
This data can be optionally downloaded into the project directory:
cd $PROJECT_DIR
wget https://www.nominatim.org/data/gb_postcodes.csv.gz
wget https://www.nominatim.org/data/us_postcodes.csv.gz
wget https://nominatim.org/data/gb_postcodes.csv.gz
wget https://nominatim.org/data/us_postcodes.csv.gz
You can also add your own custom postcode sources, see
[Customization of postcodes](../customize/Postcodes.md).
@@ -139,7 +139,7 @@ import. So this option is particularly interesting if you plan to transfer the
database or reuse the space later.
!!! warning
The datastructure for updates are also required when adding additional data
The data structures for updates are also required when adding additional data
after the import, for example [TIGER housenumber data](../customize/Tiger.md).
If you plan to use those, you must not use the `--no-updates` parameter.
Do a normal import, add the external data and once you are done with


@@ -135,7 +135,7 @@ git clone --recursive https://github.com/openstreetmap/Nominatim.git
The development version does not include the country grid. Download it separately:
```
wget -O Nominatim/data/country_osm_grid.sql.gz https://www.nominatim.org/data/country_grid.sql.gz
wget -O Nominatim/data/country_osm_grid.sql.gz https://nominatim.org/data/country_grid.sql.gz
```
### Building Nominatim


@@ -59,3 +59,27 @@ suited for these kinds of queries.
That said, if you installed your own Nominatim instance, you can use the
`nominatim export` PHP script as a basis to return such lists.
#### 7. My result has a wrong postcode. Where does it come from?
Most places in OSM don't have a postcode, so Nominatim tries to interpolate
one. It first looks at all the places that make up the address of the place.
If one of them has a postcode defined, that one is used. When none of the
address parts has a postcode either, Nominatim interpolates one from the
surrounding objects. If the postcode for your result is wrong, then most of
the time there is an OSM object with the wrong postcode nearby.
To find the bad postcode, go to
[https://nominatim.openstreetmap.org](https://nominatim.openstreetmap.org)
and search for your place. When you have found it, click on the 'details' link
under the result to go to the details page. There is a field 'Computed Postcode'
which should display the bad postcode. Click on the 'how?' link. A small
explanation text appears. It contains a link to a query for Overpass Turbo.
Click on that and you get a map with all places in the area that have the bad
postcode. If none is displayed, zoom the map out a bit and then click on 'Run'.
Now go to [OpenStreetMap](https://openstreetmap.org) and fix the error you
have just found. It will take at least a day for Nominatim to catch up with
your data fix, sometimes longer, depending on how much editing activity there
is in the area.


@@ -211,8 +211,8 @@ be more than one. The attributes of that element contain:
* `ref` - content of `ref` tag if it exists
* `lat`, `lon` - latitude and longitude of the centroid of the object
* `boundingbox` - comma-separated list of corner coordinates ([see notes](#boundingbox))
* `place_rank` - class [search rank](../develop/Ranking#search-rank)
* `address_rank` - place [address rank](../develop/Ranking#address-rank)
* `place_rank` - class [search rank](../customize/Ranking.md#search-rank)
* `address_rank` - place [address rank](../customize/Ranking.md#address-rank)
* `display_name` - full comma-separated address
* `class`, `type` - key and value of the main OSM tag
* `importance` - computed importance rank


@@ -35,7 +35,7 @@ Additional parameters are accepted as listed below.
!!! warning "Deprecation warning"
The reverse API used to allow address lookup for a single OSM object by
its OSM id. This use is now deprecated. Use the [Address Lookup API](../Lookup)
its OSM id. This use is now deprecated. Use the [Address Lookup API](Lookup.md)
instead.
### Output format


@@ -57,10 +57,11 @@ code and message, e.g.
Possible status codes are
| | message | notes |
|-----|----------------------|---------------------------------------------------|
| 700 | "No database" | connection failed |
| 701 | "Module failed" | database could not load nominatim.so |
| 702 | "Module call failed" | nominatim.so loaded but calling a function failed |
| 703 | "Query failed" | test query against a database table failed |
| 704 | "No value" | test query worked but returned no results |
| | message | notes |
| --- | ------------------------------ | ----------------------------------------------------------------- |
| 700 | "No database" | connection failed |
| 701 | "Module failed" | database could not load nominatim.so |
| 702 | "Module call failed" | nominatim.so loaded but calling a function failed |
| 703 | "Query failed" | test query against a database table failed |
| 704 | "No value" | test query worked but returned no results |
| 705 | "Import date is not available" | No import dates were returned (enabling replication can fix this) |


@@ -0,0 +1,49 @@
## Importance
Search requests can yield multiple results which match equally well with
the original query. In such a case Nominatim needs to order the results
according to a different criterion: importance. This is a measure of how
likely it is that a user will search for a given place. This section explains
the sources Nominatim uses for computing importance of a place and how to
customize them.
### How importance is computed
The main value for importance is derived from page ranking values for Wikipedia
pages for a place. For places that do not have their own
Wikipedia page, a formula is used that derives a static importance from the
places [search rank](../customize/Ranking.md#search-rank).
In a second step, a secondary importance value is added which is meant to
represent how well-known the general area is where the place is located. It
functions as a tie-breaker between places with very similar primary
importance values.
nominatim.org has preprocessed importance tables for the
[primary Wikipedia rankings](https://nominatim.org/data/wikimedia-importance.sql.gz)
and for a secondary importance based on the number of tile views on openstreetmap.org.
### Customizing secondary importance
The secondary importance is implemented as a simple
[Postgis raster](https://postgis.net/docs/raster.html) table, where Nominatim
looks up the value for the coordinates of the centroid of a place. You can
provide your own secondary importance raster in the form of an SQL file named
`secondary_importance.sql.gz` in your project directory.
The SQL file needs to drop and (re)create a table `secondary_importance` which
must at a minimum contain a column `rast` of type `raster`. The raster must
be in EPSG:4326 and contain 16bit unsigned ints
(`raster_constraint_pixel_types(rast) = '{16BUI}'`). Any other columns in the
table will be ignored. You must furthermore create an index as follows:
```
CREATE INDEX ON secondary_importance USING gist(ST_ConvexHull(rast))
```
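Putting these requirements together, a minimal hand-written `secondary_importance.sql` could look like the following. This is a sketch only: real files also contain the raster tile inserts, which are normally produced with `raster2pgsql` as shown below.

```
-- Sketch of a conforming secondary_importance.sql (minimal example).
DROP TABLE IF EXISTS secondary_importance;

-- A table with a single raster column; any further columns would be ignored.
CREATE TABLE secondary_importance (rast raster);

-- Raster tile INSERTs go here, in EPSG:4326 with 16bit unsigned
-- int pixels (16BUI), e.g. as generated by raster2pgsql.

-- The index Nominatim expects for the point lookups:
CREATE INDEX ON secondary_importance USING gist(ST_ConvexHull(rast));
```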
The following raster2pgsql command will create a table that conforms to
the requirements:
```
raster2pgsql -I -C -Y -d -t 128x128 input.tiff public.secondary_importance
```


@@ -148,6 +148,29 @@ Setting this option to 'yes' means that Nominatim skips reindexing of contained
objects when the area becomes too large.
#### NOMINATIM_UPDATE_FORWARD_DEPENDENCIES
| Summary | |
| -------------- | --------------------------------------------------- |
| **Description:** | Forward geometry changes to dependent objects |
| **Format:** | bool |
| **Default:** | no |
| **Comment:** | EXPERT ONLY. Must not be enabled after import. |
The geometry of OSM ways and relations may change when a node that is part
of the object is moved around. These changes are not propagated by default.
The geometry of ways/relations is only updated the next time the object
itself is touched. When this option is enabled, dependent objects will
be marked for update when one of their member objects changes.
Enabling this option may slow down updates significantly.
!!! warning
If you want to enable this option, it must already be set at import time.
Do not enable this option on an existing database that was imported with
NOMINATIM_UPDATE_FORWARD_DEPENDENCIES=no.
Updates will become unusably slow.
#### NOMINATIM_LANGUAGES
| Summary | |
@@ -643,7 +666,7 @@ The entries in the log file have the following format:
<request time> <execution time in s> <number of results> <type> "<query string>"
Request time is the time when the request was started. The execution time is
given in ms and corresponds to the time the query took executing in PHP.
given in seconds and corresponds to the time the query took executing in PHP.
type contains the name of the endpoint used.
Can be used at the same time as NOMINATIM_LOG_DB.
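For illustration, a log line in this format could be picked apart like this. This is a sketch; the exact regular expression (in particular the two-token request time) and the field names are assumptions based on the format description above:

```python
import re

# One line of the query log: request time (assumed to be two tokens, date
# and time), execution time in seconds, result count, endpoint type and
# the quoted query string.
LOG_LINE = re.compile(
    r'(?P<reqtime>\S+ \S+) (?P<exectime>[\d.]+) (?P<results>\d+)'
    r' (?P<type>\S+) "(?P<query>.*)"')

def parse_log_line(line):
    """Return a dict of the log fields, or None if the line does not match."""
    m = LOG_LINE.match(line)
    return m.groupdict() if m else None
```
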


@@ -5,22 +5,22 @@ address set to complement the OSM house number data in the US. You can add
TIGER data to your own Nominatim instance by following these steps. The
entire US adds about 10GB to your database.
1. Get preprocessed TIGER 2021 data:
1. Get preprocessed TIGER data:
cd $PROJECT_DIR
wget https://nominatim.org/data/tiger2021-nominatim-preprocessed.csv.tar.gz
wget https://nominatim.org/data/tiger-nominatim-preprocessed-latest.csv.tar.gz
2. Import the data into your Nominatim database:
nominatim add-data --tiger-data tiger2021-nominatim-preprocessed.csv.tar.gz
nominatim add-data --tiger-data tiger-nominatim-preprocessed-latest.csv.tar.gz
3. Enable use of the Tiger data in your `.env` by adding:
3. Enable use of the Tiger data in your existing `.env` file by adding:
echo NOMINATIM_USE_US_TIGER_DATA=yes >> .env
4. Apply the new settings:
nominatim refresh --functions
nominatim refresh --functions --website
See the [TIGER-data project](https://github.com/osm-search/TIGER-data) for more


@@ -213,6 +213,15 @@ The following is a list of sanitizers that are shipped with Nominatim.
rendering:
heading_level: 6
##### clean-tiger-tags
::: nominatim.tokenizer.sanitizers.clean_tiger_tags
selection:
members: False
rendering:
heading_level: 6
#### Token Analysis


@@ -55,8 +55,8 @@ To install all necessary packages run:
sudo apt install php-cgi phpunit php-codesniffer \
python3-pip python3-setuptools python3-dev
pip3 install --user behave mkdocs mkdocstrings pytest \
pylint mypy types-PyYAML types-jinja2 types-psycopg2
pip3 install --user behave mkdocs mkdocstrings pytest pylint \
mypy types-PyYAML types-jinja2 types-psycopg2 types-psutil
```
The `mkdocs` executable will be located in `.local/bin`. You may have to add


@@ -30,6 +30,7 @@ nav:
- 'Configuration Settings': 'customize/Settings.md'
- 'Per-Country Data': 'customize/Country-Settings.md'
- 'Place Ranking' : 'customize/Ranking.md'
- 'Importance' : 'customize/Importance.md'
- 'Tokenizers' : 'customize/Tokenizers.md'
- 'Special Phrases': 'customize/Special-Phrases.md'
- 'External data: US housenumbers from TIGER': 'customize/Tiger.md'


@@ -135,7 +135,7 @@ class Debug
public static function printSQL($sSQL)
{
echo '<p><tt><font color="#aaa">'.$sSQL.'</font></tt></p>'."\n";
echo '<p><tt><font color="#aaa">'.htmlspecialchars($sSQL, ENT_QUOTES | ENT_SUBSTITUTE | ENT_HTML401).'</font></tt></p>'."\n";
}
private static function outputVar($mVar, $sPreNL)
@@ -178,11 +178,12 @@ class Debug
}
if (is_string($mVar)) {
echo "'$mVar'";
return strlen($mVar) + 2;
$sOut = "'$mVar'";
} else {
$sOut = (string)$mVar;
}
echo (string)$mVar;
return strlen((string)$mVar);
echo htmlspecialchars($sOut, ENT_QUOTES | ENT_SUBSTITUTE | ENT_HTML401);
return strlen($sOut);
}
}


@@ -103,7 +103,7 @@ class Geocode
}
$this->iFinalLimit = $iLimit;
$this->iLimit = $iLimit + min($iLimit, 10);
$this->iLimit = $iLimit + max($iLimit, 10);
}
public function setFeatureType($sFeatureType)
@@ -874,7 +874,7 @@ class Geocode
$iCountWords = 0;
$sAddress = $aResult['langaddress'];
foreach ($aRecheckWords as $i => $sWord) {
if (stripos($sAddress, $sWord)!==false) {
if (grapheme_stripos($sAddress, $sWord)!==false) {
$iCountWords++;
if (preg_match('/(^|,)\s*'.preg_quote($sWord, '/').'\s*(,|$)/', $sAddress)) {
$iCountWords += 0.1;


@@ -187,12 +187,12 @@ class PlaceLookup
return null;
}
$aResults = $this->lookup(array($iPlaceID => new Result($iPlaceID)));
$aResults = $this->lookup(array($iPlaceID => new Result($iPlaceID)), 0, 30, true);
return empty($aResults) ? null : reset($aResults);
}
public function lookup($aResults, $iMinRank = 0, $iMaxRank = 30)
public function lookup($aResults, $iMinRank = 0, $iMaxRank = 30, $bAllowLinked = false)
{
Debug::newFunction('Place lookup');
@@ -247,7 +247,9 @@ class PlaceLookup
if ($this->sAllowedTypesSQLList) {
$sSQL .= 'AND placex.class in '.$this->sAllowedTypesSQLList;
}
$sSQL .= ' AND linked_place_id is null ';
if (!$bAllowLinked) {
$sSQL .= ' AND linked_place_id is null ';
}
$sSQL .= ' GROUP BY ';
$sSQL .= ' osm_type, ';
$sSQL .= ' osm_id, ';
@@ -522,12 +524,7 @@ class PlaceLookup
// Get the bounding box and outline polygon
$sSQL = 'select place_id,0 as numfeatures,st_area(geometry) as area,';
if ($fLonReverse != null && $fLatReverse != null) {
$sSQL .= ' ST_Y(closest_point) as centrelat,';
$sSQL .= ' ST_X(closest_point) as centrelon,';
} else {
$sSQL .= ' ST_Y(centroid) as centrelat, ST_X(centroid) as centrelon,';
}
$sSQL .= ' ST_Y(centroid) as centrelat, ST_X(centroid) as centrelon,';
$sSQL .= ' ST_YMin(geometry) as minlat,ST_YMax(geometry) as maxlat,';
$sSQL .= ' ST_XMin(geometry) as minlon,ST_XMax(geometry) as maxlon';
if ($this->bIncludePolygonAsGeoJSON) {
@@ -542,19 +539,21 @@ class PlaceLookup
if ($this->bIncludePolygonAsText) {
$sSQL .= ',ST_AsText(geometry) as astext';
}
$sSQL .= ' FROM (SELECT place_id';
if ($fLonReverse != null && $fLatReverse != null) {
$sFrom = ' from (SELECT * , CASE WHEN (class = \'highway\') AND (ST_GeometryType(geometry) = \'ST_LineString\') THEN ';
$sFrom .=' ST_ClosestPoint(geometry, ST_SetSRID(ST_Point('.$fLatReverse.','.$fLonReverse.'),4326))';
$sFrom .=' ELSE centroid END AS closest_point';
$sFrom .= ' from placex where place_id = '.$iPlaceID.') as plx';
$sSQL .= ',CASE WHEN (class = \'highway\') AND (ST_GeometryType(geometry) = \'ST_LineString\') THEN ';
$sSQL .=' ST_ClosestPoint(geometry, ST_SetSRID(ST_Point('.$fLatReverse.','.$fLonReverse.'),4326))';
$sSQL .=' ELSE centroid END AS centroid';
} else {
$sFrom = ' from placex where place_id = '.$iPlaceID;
$sSQL .= ',centroid';
}
if ($this->fPolygonSimplificationThreshold > 0) {
$sSQL .= ' from (select place_id,centroid,ST_SimplifyPreserveTopology(geometry,'.$this->fPolygonSimplificationThreshold.') as geometry'.$sFrom.') as plx';
$sSQL .= ',ST_SimplifyPreserveTopology(geometry,'.$this->fPolygonSimplificationThreshold.') as geometry';
} else {
$sSQL .= $sFrom;
$sSQL .= ',geometry';
}
$sSQL .= ' FROM placex where place_id = '.$iPlaceID.') as plx';
$aPointPolygon = $this->oDB->getRow($sSQL, null, 'Could not get outline');


@@ -71,7 +71,8 @@ class ReverseGeocode
$sSQL .= ' ST_Distance(linegeo,'.$sPointSQL.') as distance';
$sSQL .= ' FROM location_property_osmline';
$sSQL .= ' WHERE ST_DWithin('.$sPointSQL.', linegeo, '.$fSearchDiam.')';
$sSQL .= ' and indexed_status = 0 and startnumber is not NULL ';
$sSQL .= ' and indexed_status = 0 and startnumber is not NULL ';
$sSQL .= ' and parent_place_id != 0';
$sSQL .= ' ORDER BY distance ASC limit 1';
Debug::printSQL($sSQL);
@@ -188,14 +189,16 @@ class ReverseGeocode
$sSQL .= '(select place_id, parent_place_id, rank_address, rank_search, country_code, geometry';
$sSQL .= ' FROM placex';
$sSQL .= ' WHERE ST_GeometryType(geometry) in (\'ST_Polygon\', \'ST_MultiPolygon\')';
$sSQL .= ' AND rank_address Between 5 AND ' .$iMaxRank;
// Ensure that query planner doesn't use the index on rank_search.
$sSQL .= ' AND coalesce(rank_search, 0) between 5 and ' .$iMaxRank;
$sSQL .= ' AND rank_address between 4 and 25'; // needed for index selection
$sSQL .= ' AND geometry && '.$sPointSQL;
$sSQL .= ' AND type != \'postcode\' ';
$sSQL .= ' AND name is not null';
$sSQL .= ' AND indexed_status = 0 and linked_place_id is null';
$sSQL .= ' ORDER BY rank_address DESC LIMIT 50 ) as a';
$sSQL .= ' WHERE ST_CONTAINS(geometry, '.$sPointSQL.' )';
$sSQL .= ' ORDER BY rank_address DESC LIMIT 1';
$sSQL .= ' ORDER BY rank_search DESC LIMIT 50 ) as a';
$sSQL .= ' WHERE ST_Contains(geometry, '.$sPointSQL.' )';
$sSQL .= ' ORDER BY rank_search DESC LIMIT 1';
Debug::printSQL($sSQL);
$aPoly = $this->oDB->getRow($sSQL, null, 'Could not determine polygon containing the point.');
@@ -207,7 +210,7 @@ class ReverseGeocode
$iRankSearch = $aPoly['rank_search'];
$iPlaceID = $aPoly['place_id'];
if ($iRankAddress != $iMaxRank) {
if ($iRankSearch != $iMaxRank) {
$sSQL = 'SELECT place_id FROM ';
$sSQL .= '(SELECT place_id, rank_search, country_code, geometry,';
$sSQL .= ' ST_distance('.$sPointSQL.', geometry) as distance';


@@ -69,19 +69,31 @@ class SpecialTerm
*/
public function extendSearch($oSearch, $oPosition)
{
$iSearchCost = 2;
$iSearchCost = 0;
$iOp = $this->iOperator;
if ($iOp == \Nominatim\Operator::NONE) {
if ($oSearch->hasName() || $oSearch->getContext()->isBoundedSearch()) {
if ($oPosition->isFirstToken()
|| $oSearch->hasName()
|| $oSearch->getContext()->isBoundedSearch()
) {
$iOp = \Nominatim\Operator::NAME;
$iSearchCost += 3;
} else {
$iOp = \Nominatim\Operator::NEAR;
$iSearchCost += 2;
$iSearchCost += 4;
if (!$oPosition->isFirstToken()) {
$iSearchCost += 3;
}
}
} elseif (!$oPosition->isFirstToken() && !$oPosition->isLastToken()) {
} elseif ($oPosition->isFirstToken()) {
$iSearchCost += 2;
} elseif ($oPosition->isLastToken()) {
$iSearchCost += 4;
} else {
$iSearchCost += 6;
}
if ($oSearch->hasHousenumber()) {
$iSearchCost ++;
}


@@ -36,6 +36,9 @@ if (empty($aPlace)) {
$aFilteredPlaces['properties']['geocoding']['osm_id'] = $aPlace['osm_id'];
}
$aFilteredPlaces['properties']['geocoding']['osm_key'] = $aPlace['class'];
$aFilteredPlaces['properties']['geocoding']['osm_value'] = $aPlace['type'];
$aFilteredPlaces['properties']['geocoding']['type'] = addressRankToGeocodeJsonType($aPlace['rank_address']);
$aFilteredPlaces['properties']['geocoding']['accuracy'] = (int) $fDistance;


@@ -100,32 +100,55 @@ LANGUAGE plpgsql STABLE;
CREATE OR REPLACE FUNCTION compute_importance(extratags HSTORE,
country_code varchar(2),
osm_type varchar(1), osm_id BIGINT)
rank_search SMALLINT,
centroid GEOMETRY)
RETURNS place_importance
AS $$
DECLARE
match RECORD;
result place_importance;
osm_views_exists BIGINT;
views BIGINT;
BEGIN
FOR match IN SELECT * FROM get_wikipedia_match(extratags, country_code)
WHERE language is not NULL
-- add importance by wikipedia article if the place has one
FOR match IN
SELECT * FROM get_wikipedia_match(extratags, country_code)
WHERE language is not NULL
LOOP
result.importance := match.importance;
result.wikipedia := match.language || ':' || match.title;
RETURN result;
END LOOP;
IF extratags ? 'wikidata' THEN
-- Nothing? Then try with the wikidata tag.
IF result.importance is null AND extratags ? 'wikidata' THEN
FOR match IN SELECT * FROM wikipedia_article
WHERE wd_page_title = extratags->'wikidata'
ORDER BY language = 'en' DESC, langcount DESC LIMIT 1 LOOP
ORDER BY language = 'en' DESC, langcount DESC LIMIT 1
LOOP
result.importance := match.importance;
result.wikipedia := match.language || ':' || match.title;
RETURN result;
END LOOP;
END IF;
RETURN null;
-- Still nothing? Fall back to a default.
IF result.importance is null THEN
result.importance := 0.75001 - (rank_search::float / 40);
END IF;
{% if 'secondary_importance' in db.tables %}
FOR match IN
SELECT ST_Value(rast, centroid) as importance
FROM secondary_importance
WHERE ST_Intersects(ST_ConvexHull(rast), centroid) LIMIT 1
LOOP
-- Secondary importance as tie breaker with 0.0001 weight.
result.importance := result.importance + match.importance::float / 655350000;
END LOOP;
{% endif %}
RETURN result;
END;
$$
LANGUAGE plpgsql;
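The two magic numbers in the function above can be sanity-checked with a little arithmetic. This is a sketch with made-up function names; the constants come directly from the SQL:

```python
# Fallback importance derived from the search rank: rank 0 maps to just
# above 0.75, while the maximum rank 30 still yields a tiny positive value.
def fallback_importance(rank_search):
    return 0.75001 - rank_search / 40

# Secondary importance tie-breaker: a 16bit raster value (max 65535) is
# scaled down so its full range contributes at most roughly 0.0001,
# i.e. it only breaks ties between near-equal primary importances.
def secondary_contribution(raster_value):
    return raster_value / 655350000
```
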


@@ -15,7 +15,7 @@ DECLARE
location RECORD;
waynodes BIGINT[];
BEGIN
IF akeys(in_address) != ARRAY['interpolation'] THEN
IF in_address ? 'street' or in_address ? 'place' THEN
RETURN in_address;
END IF;
@@ -52,7 +52,9 @@ BEGIN
IF parent_place_id is null THEN
FOR location IN SELECT place_id FROM placex
WHERE ST_DWithin(geom, placex.geometry, 0.001) and placex.rank_search = 26
WHERE ST_DWithin(geom, placex.geometry, 0.001)
and placex.rank_search = 26
and placex.osm_type = 'W' -- needed for index selection
ORDER BY CASE WHEN ST_GeometryType(geom) = 'ST_Line' THEN
(ST_distance(placex.geometry, ST_LineInterpolatePoint(geom,0))+
ST_distance(placex.geometry, ST_LineInterpolatePoint(geom,0.5))+
@@ -82,27 +84,35 @@ CREATE OR REPLACE FUNCTION reinsert_interpolation(way_id BIGINT, addr HSTORE,
DECLARE
existing BIGINT[];
BEGIN
-- Get the existing entry from the interpolation table.
SELECT array_agg(place_id) INTO existing
FROM location_property_osmline WHERE osm_id = way_id;
IF existing IS NULL or array_length(existing, 1) = 0 THEN
INSERT INTO location_property_osmline (osm_id, address, linegeo)
VALUES (way_id, addr, geom);
IF addr is NULL OR NOT addr ? 'interpolation'
OR NOT (addr->'interpolation' in ('odd', 'even', 'all')
or addr->'interpolation' similar to '[1-9]')
THEN
-- the new interpolation is illegal, simply remove existing entries
DELETE FROM location_property_osmline WHERE osm_id = way_id;
ELSE
-- Update the interpolation table:
-- The first entry gets the original data, all other entries
-- are removed and will be recreated on indexing.
-- (An interpolation can be split up, if it has more than 2 address nodes)
UPDATE location_property_osmline
SET address = addr,
linegeo = geom,
startnumber = null,
indexed_status = 1
WHERE place_id = existing[1];
IF array_length(existing, 1) > 1 THEN
DELETE FROM location_property_osmline
WHERE place_id = any(existing[2:]);
-- Get the existing entry from the interpolation table.
SELECT array_agg(place_id) INTO existing
FROM location_property_osmline WHERE osm_id = way_id;
IF existing IS NULL or array_length(existing, 1) = 0 THEN
INSERT INTO location_property_osmline (osm_id, address, linegeo)
VALUES (way_id, addr, geom);
ELSE
-- Update the interpolation table:
-- The first entry gets the original data, all other entries
-- are removed and will be recreated on indexing.
-- (An interpolation can be split up, if it has more than 2 address nodes)
UPDATE location_property_osmline
SET address = addr,
linegeo = geom,
startnumber = null,
indexed_status = 1
WHERE place_id = existing[1];
IF array_length(existing, 1) > 1 THEN
DELETE FROM location_property_osmline
WHERE place_id = any(existing[2:]);
END IF;
END IF;
END IF;
@@ -154,7 +164,7 @@ DECLARE
newend INTEGER;
moddiff SMALLINT;
linegeo GEOMETRY;
splitline GEOMETRY;
splitpoint FLOAT;
sectiongeo GEOMETRY;
postcode TEXT;
stepmod SMALLINT;
@@ -213,15 +223,27 @@ BEGIN
FROM placex, generate_series(1, array_upper(waynodes, 1)) nodeidpos
WHERE osm_type = 'N' and osm_id = waynodes[nodeidpos]::BIGINT
and address is not NULL and address ? 'housenumber'
and ST_Distance(NEW.linegeo, geometry) < 0.0005
ORDER BY nodeidpos
LOOP
{% if debug %}RAISE WARNING 'processing point % (%)', nextnode.hnr, ST_AsText(nextnode.geometry);{% endif %}
IF linegeo is null THEN
linegeo := NEW.linegeo;
ELSE
splitline := ST_Split(ST_Snap(linegeo, nextnode.geometry, 0.0005), nextnode.geometry);
sectiongeo := ST_GeometryN(splitline, 1);
linegeo := ST_GeometryN(splitline, 2);
splitpoint := ST_LineLocatePoint(linegeo, nextnode.geometry);
IF splitpoint = 0 THEN
-- Corner case where the splitpoint falls on the first point
-- and thus would not return a geometry. Skip that section.
sectiongeo := NULL;
ELSEIF splitpoint = 1 THEN
-- Point is at the end of the line.
sectiongeo := linegeo;
linegeo := NULL;
ELSE
-- Split the line.
sectiongeo := ST_LineSubstring(linegeo, 0, splitpoint);
linegeo := ST_LineSubstring(linegeo, splitpoint, 1);
END IF;
END IF;
IF prevnode.hnr is not null
@@ -229,6 +251,9 @@ BEGIN
-- regularly mapped housenumbers.
-- (Conveniently also fails if one of the house numbers is not a number.)
and abs(prevnode.hnr - nextnode.hnr) > NEW.step
-- If the interpolation geometry is broken or two nodes are at the
-- same place, then splitting might produce a point. Ignore that.
and ST_GeometryType(sectiongeo) = 'ST_LineString'
THEN
IF prevnode.hnr < nextnode.hnr THEN
startnumber := prevnode.hnr;
@@ -290,12 +315,12 @@ BEGIN
NEW.address, postcode,
NEW.country_code, NEW.geometry_sector, 0);
END IF;
END IF;
-- early break if we are out of line string,
-- might happen when a line string loops back on itself
IF ST_GeometryType(linegeo) != 'ST_LineString' THEN
RETURN NEW;
END IF;
-- early break if we are out of line string,
-- might happen when a line string loops back on itself
IF linegeo is null or ST_GeometryType(linegeo) != 'ST_LineString' THEN
RETURN NEW;
END IF;
prevnode := nextnode;
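
The ST_LineLocatePoint/ST_LineSubstring combination introduced above can be sketched in planar Python. This is an illustrative stand-in, not the patch itself: `locate_point` is a simplified pure-Python replacement for ST_LineLocatePoint, and `split_ranges` returns fraction intervals instead of geometries, but it handles the same two corner cases (split point at the start or the end of the line).

```python
from math import hypot

def locate_point(line, pt):
    """Fraction (0..1) along `line` closest to `pt` -- a planar,
    simplified stand-in for PostGIS ST_LineLocatePoint."""
    total = sum(hypot(x2 - x1, y2 - y1)
                for (x1, y1), (x2, y2) in zip(line, line[1:]))
    best_dist, best_frac, run = float('inf'), 0.0, 0.0
    for (x1, y1), (x2, y2) in zip(line, line[1:]):
        seg = hypot(x2 - x1, y2 - y1)
        if seg > 0:
            # clamp the projection of pt onto this segment to [0, 1]
            t = ((pt[0] - x1) * (x2 - x1) + (pt[1] - y1) * (y2 - y1)) / seg**2
            t = max(0.0, min(1.0, t))
            px, py = x1 + t * (x2 - x1), y1 + t * (y2 - y1)
            d = hypot(pt[0] - px, pt[1] - py)
            if d < best_dist:
                best_dist, best_frac = d, (run + t * seg) / total
        run += seg
    return best_frac

def split_ranges(line, node):
    """Return (section, remainder) as fraction intervals, handling the
    same corner cases as the trigger: a split at the very start yields
    no section, a split at the very end consumes the whole line."""
    f = locate_point(line, node)
    if f == 0.0:
        return None, (0.0, 1.0)      # skip the empty section
    if f == 1.0:
        return (0.0, 1.0), None      # line is used up
    return (0.0, f), (f, 1.0)
```

Unlike the earlier ST_Snap/ST_Split approach, this never shortens the line: both halves together always cover the original geometry.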


@@ -34,6 +34,11 @@ BEGIN
RETURN null;
END IF;
-- Remove the place from the list of places to be deleted
DELETE FROM place_to_be_deleted pdel
WHERE pdel.osm_type = NEW.osm_type and pdel.osm_id = NEW.osm_id
and pdel.class = NEW.class;
-- Have we already done this place?
SELECT * INTO existing
FROM place
@@ -42,8 +47,6 @@ BEGIN
{% if debug %}RAISE WARNING 'Existing: %',existing.osm_id;{% endif %}
-- Handle a place changing type by removing the old data.
-- (This trigger is executed BEFORE INSERT of the NEW tuple.)
IF existing.osm_type IS NULL THEN
DELETE FROM place where osm_type = NEW.osm_type and osm_id = NEW.osm_id and class = NEW.class;
END IF;
@@ -187,15 +190,11 @@ BEGIN
END IF;
{% endif %}
IF existing.osm_type IS NOT NULL THEN
-- Pathological case caused by the triggerless copy into place during initial import
-- force delete even for large areas, it will be reinserted later
UPDATE place SET geometry = ST_SetSRID(ST_Point(0,0), 4326)
WHERE osm_type = NEW.osm_type and osm_id = NEW.osm_id
and class = NEW.class and type = NEW.type;
DELETE FROM place
WHERE osm_type = NEW.osm_type and osm_id = NEW.osm_id
and class = NEW.class and type = NEW.type;
IF existingplacex.osm_type is not NULL THEN
-- Mark any existing place for delete in the placex table
UPDATE placex SET indexed_status = 100
WHERE placex.osm_type = NEW.osm_type and placex.osm_id = NEW.osm_id
and placex.class = NEW.class and placex.type = NEW.type;
END IF;
-- Process it as a new insertion
@@ -206,6 +205,27 @@ BEGIN
{% if debug %}RAISE WARNING 'insert done % % % % %',NEW.osm_type,NEW.osm_id,NEW.class,NEW.type,NEW.name;{% endif %}
IF existing.osm_type is not NULL THEN
-- If there is already an entry in place, just update that, if necessary.
IF coalesce(existing.name, ''::hstore) != coalesce(NEW.name, ''::hstore)
or coalesce(existing.address, ''::hstore) != coalesce(NEW.address, ''::hstore)
or coalesce(existing.extratags, ''::hstore) != coalesce(NEW.extratags, ''::hstore)
or coalesce(existing.admin_level, 15) != coalesce(NEW.admin_level, 15)
or existing.geometry::text != NEW.geometry::text
THEN
UPDATE place
SET name = NEW.name,
address = NEW.address,
extratags = NEW.extratags,
admin_level = NEW.admin_level,
geometry = NEW.geometry
WHERE osm_type = NEW.osm_type and osm_id = NEW.osm_id
and class = NEW.class and type = NEW.type;
END IF;
RETURN NULL;
END IF;
RETURN NEW;
END IF;
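
The conditional UPDATE above only touches the row when one of the tracked columns actually differs, with NULLs normalised via coalesce() so that NULL and the default compare equal. A minimal sketch of the same change test over dicts (field names and defaults are illustrative):

```python
# Defaults play the role of coalesce(): a missing/None value and the
# default compare as equal, so no spurious update is triggered.
DEFAULTS = {'name': '', 'address': '', 'extratags': '', 'admin_level': 15}

def needs_update(existing, new):
    """True when any tracked field differs after NULL-normalisation."""
    return any(
        (existing.get(field) if existing.get(field) is not None else default)
        != (new.get(field) if new.get(field) is not None else default)
        for field, default in DEFAULTS.items())
```

Skipping the no-op UPDATE matters here because every write to `place` would otherwise fire the expensive indexing machinery again.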
@@ -321,35 +341,79 @@ BEGIN
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE FUNCTION place_delete()
RETURNS TRIGGER
AS $$
DECLARE
has_rank BOOLEAN;
deferred BOOLEAN;
BEGIN
{% if debug %}RAISE WARNING 'Delete for % % %/%', OLD.osm_type, OLD.osm_id, OLD.class, OLD.type;{% endif %}
{% if debug %}RAISE WARNING 'delete: % % % %',OLD.osm_type,OLD.osm_id,OLD.class,OLD.type;{% endif %}
-- deleting large polygons can have a massive effect on the system - require manual intervention to let them through
IF st_area(OLD.geometry) > 2 and st_isvalid(OLD.geometry) THEN
SELECT bool_or(not (rank_address = 0 or rank_address > 25)) as ranked FROM placex WHERE osm_type = OLD.osm_type and osm_id = OLD.osm_id and class = OLD.class and type = OLD.type INTO has_rank;
IF has_rank THEN
insert into import_polygon_delete (osm_type, osm_id, class, type) values (OLD.osm_type,OLD.osm_id,OLD.class,OLD.type);
RETURN NULL;
END IF;
deferred := ST_IsValid(OLD.geometry) and ST_Area(OLD.geometry) > 2;
IF deferred THEN
SELECT bool_or(not (rank_address = 0 or rank_address > 25)) INTO deferred
FROM placex
WHERE osm_type = OLD.osm_type and osm_id = OLD.osm_id
and class = OLD.class and type = OLD.type;
END IF;
-- mark for delete
UPDATE placex set indexed_status = 100 where osm_type = OLD.osm_type and osm_id = OLD.osm_id and class = OLD.class and type = OLD.type;
INSERT INTO place_to_be_deleted (osm_type, osm_id, class, type, deferred)
VALUES(OLD.osm_type, OLD.osm_id, OLD.class, OLD.type, deferred);
-- interpolations are special
IF OLD.osm_type='W' and OLD.class = 'place' and OLD.type = 'houses' THEN
UPDATE location_property_osmline set indexed_status = 100 where osm_id = OLD.osm_id; -- osm_id = wayid (=old.osm_id)
END IF;
RETURN OLD;
RETURN NULL;
END;
$$
LANGUAGE plpgsql;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE FUNCTION flush_deleted_places()
RETURNS INTEGER
AS $$
BEGIN
-- deleting large polygons can have a massive effect on the system - require manual intervention to let them through
INSERT INTO import_polygon_delete (osm_type, osm_id, class, type)
SELECT osm_type, osm_id, class, type FROM place_to_be_deleted WHERE deferred;
-- delete from place table
ALTER TABLE place DISABLE TRIGGER place_before_delete;
DELETE FROM place USING place_to_be_deleted
WHERE place.osm_type = place_to_be_deleted.osm_type
and place.osm_id = place_to_be_deleted.osm_id
and place.class = place_to_be_deleted.class
and place.type = place_to_be_deleted.type
and not deferred;
ALTER TABLE place ENABLE TRIGGER place_before_delete;
-- Mark for delete in the placex table
UPDATE placex SET indexed_status = 100 FROM place_to_be_deleted
WHERE placex.osm_type = 'N' and place_to_be_deleted.osm_type = 'N'
and placex.osm_id = place_to_be_deleted.osm_id
and placex.class = place_to_be_deleted.class
and placex.type = place_to_be_deleted.type
and not deferred;
UPDATE placex SET indexed_status = 100 FROM place_to_be_deleted
WHERE placex.osm_type = 'W' and place_to_be_deleted.osm_type = 'W'
and placex.osm_id = place_to_be_deleted.osm_id
and placex.class = place_to_be_deleted.class
and placex.type = place_to_be_deleted.type
and not deferred;
UPDATE placex SET indexed_status = 100 FROM place_to_be_deleted
WHERE placex.osm_type = 'R' and place_to_be_deleted.osm_type = 'R'
and placex.osm_id = place_to_be_deleted.osm_id
and placex.class = place_to_be_deleted.class
and placex.type = place_to_be_deleted.type
and not deferred;
-- Mark for delete in interpolations
UPDATE location_property_osmline SET indexed_status = 100 FROM place_to_be_deleted
WHERE place_to_be_deleted.osm_type = 'W'
and place_to_be_deleted.class = 'place'
and place_to_be_deleted.type = 'houses'
and location_property_osmline.osm_id = place_to_be_deleted.osm_id
and not deferred;
-- Clear todo list.
TRUNCATE TABLE place_to_be_deleted;
RETURN NULL;
END;
$$ LANGUAGE plpgsql;
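
The mark-then-flush flow above — collect deletions in place_to_be_deleted, defer suspiciously large areas for manual intervention, apply the rest in one batch — can be modelled with a small queue. A hedged sketch; the class and parameter names are invented, only the control flow mirrors the triggers:

```python
class DeferredDeleter:
    """Collect delete requests, defer items flagged by `needs_review`
    and apply the rest in one batch on flush()."""

    def __init__(self, needs_review):
        self.todo = []                # (item, deferred) pairs
        self.needs_review = needs_review

    def request_delete(self, item):
        # trigger side: just record the request, do not delete yet
        self.todo.append((item, self.needs_review(item)))

    def flush(self, store, review_queue):
        # batch side: deferred items go to manual review
        # (cf. import_polygon_delete), everything else is removed
        for item, deferred in self.todo:
            if deferred:
                review_queue.append(item)
            else:
                store.discard(item)
        self.todo.clear()             # cf. TRUNCATE place_to_be_deleted
```

Batching the deletes is what allows the SQL version to disable the `place_before_delete` trigger once around the whole DELETE instead of paying trigger overhead per row.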


@@ -107,12 +107,17 @@ LANGUAGE plpgsql STABLE;
CREATE OR REPLACE FUNCTION find_associated_street(poi_osm_type CHAR(1),
poi_osm_id BIGINT)
poi_osm_id BIGINT,
bbox GEOMETRY)
RETURNS BIGINT
AS $$
DECLARE
location RECORD;
parent RECORD;
result BIGINT;
distance FLOAT;
new_distance FLOAT;
waygeom GEOMETRY;
BEGIN
FOR location IN
SELECT members FROM planet_osm_rels
@@ -123,19 +128,34 @@ BEGIN
FOR i IN 1..array_upper(location.members, 1) BY 2 LOOP
IF location.members[i+1] = 'street' THEN
FOR parent IN
SELECT place_id from placex
SELECT place_id, geometry
FROM placex
WHERE osm_type = upper(substring(location.members[i], 1, 1))::char(1)
and osm_id = substring(location.members[i], 2)::bigint
and name is not null
and rank_search between 26 and 27
LOOP
RETURN parent.place_id;
-- Find the closest 'street' member.
-- Avoid distance computation for the frequent case where there is
-- only one street member.
IF waygeom is null THEN
result := parent.place_id;
waygeom := parent.geometry;
ELSE
distance := coalesce(distance, ST_Distance(waygeom, bbox));
new_distance := ST_Distance(parent.geometry, bbox);
IF new_distance < distance THEN
distance := new_distance;
result := parent.place_id;
waygeom := parent.geometry;
END IF;
END IF;
END LOOP;
END IF;
END LOOP;
END LOOP;
RETURN NULL;
RETURN result;
END;
$$
LANGUAGE plpgsql STABLE;
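
The loop above avoids any ST_Distance call in the frequent case of a relation with a single street member by filling in the reference distance lazily. The same pattern in plain Python (function names are illustrative):

```python
def closest_member(candidates, ref, dist):
    """Return the candidate minimising dist(candidate, ref); the
    distance of the first candidate is only computed once a second
    candidate shows up (cf. coalesce(distance, ST_Distance(...)))."""
    best, best_dist = None, None
    for cand in candidates:
        if best is None:
            best = cand                     # no distance needed yet
            continue
        if best_dist is None:
            best_dist = dist(best, ref)     # lazily fill in
        d = dist(cand, ref)
        if d < best_dist:
            best, best_dist = cand, d
    return best
```

With a single candidate the function returns immediately without ever calling `dist`, which is exactly the fast path the SQL comment describes.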
@@ -162,7 +182,7 @@ BEGIN
{% if debug %}RAISE WARNING 'finding street for % %', poi_osm_type, poi_osm_id;{% endif %}
-- Is this object part of an associatedStreet relation?
parent_place_id := find_associated_street(poi_osm_type, poi_osm_id);
parent_place_id := find_associated_street(poi_osm_type, poi_osm_id, bbox);
IF parent_place_id is null THEN
parent_place_id := find_parent_for_address(token_info, poi_partition, bbox);
@@ -185,7 +205,7 @@ BEGIN
RETURN location.place_id;
END IF;
parent_place_id := find_associated_street('W', location.osm_id);
parent_place_id := find_associated_street('W', location.osm_id, bbox);
END LOOP;
END IF;
@@ -197,6 +217,7 @@ BEGIN
SELECT place_id FROM placex
WHERE bbox && geometry AND _ST_Covers(geometry, ST_Centroid(bbox))
AND rank_address between 5 and 25
AND ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon')
ORDER BY rank_address desc
LOOP
RETURN location.place_id;
@@ -212,6 +233,7 @@ BEGIN
SELECT place_id FROM placex
WHERE bbox && geometry AND _ST_Covers(geometry, ST_Centroid(bbox))
AND rank_address between 5 and 25
AND ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon')
ORDER BY rank_address desc
LOOP
RETURN location.place_id;
@@ -275,7 +297,9 @@ BEGIN
-- If extratags has a place tag, look for linked nodes by their place type.
-- Area and node still have to have the same name.
IF bnd.extratags ? 'place' and bnd_name is not null THEN
IF bnd.extratags ? 'place' and bnd.extratags->'place' != 'postcode'
and bnd_name is not null
THEN
FOR linked_placex IN
SELECT * FROM placex
WHERE (position(lower(name->'name') in bnd_name) > 0
@@ -284,7 +308,6 @@ BEGIN
AND placex.osm_type = 'N'
AND (placex.linked_place_id is null or placex.linked_place_id = bnd.place_id)
AND placex.rank_search < 26 -- needed to select the right index
AND placex.type != 'postcode'
AND ST_Covers(bnd.geometry, placex.geometry)
LOOP
{% if debug %}RAISE WARNING 'Found type-matching place node %', linked_placex.osm_id;{% endif %}
@@ -846,7 +869,8 @@ BEGIN
FROM placex
WHERE osm_type = 'R' and class = 'boundary' and type = 'administrative'
and admin_level < NEW.admin_level and admin_level > 3
and rank_address > 0
and rank_address between 1 and 25 -- for index selection
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon') -- for index selection
and geometry && NEW.centroid and _ST_Covers(geometry, NEW.centroid)
ORDER BY admin_level desc LIMIT 1
LOOP
@@ -874,8 +898,9 @@ BEGIN
FROM placex,
LATERAL compute_place_rank(country_code, 'A', class, type,
admin_level, False, null) prank
WHERE class = 'place' and rank_address < 24
WHERE class = 'place' and rank_address between 1 and 23
and prank.address_rank >= NEW.rank_address
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon') -- select right index
and geometry && NEW.geometry
and geometry ~ NEW.geometry -- needed because ST_Relate does not do bbox cover test
and ST_Relate(geometry, NEW.geometry, 'T*T***FF*') -- contains but not equal
@@ -896,6 +921,8 @@ BEGIN
LATERAL compute_place_rank(country_code, 'A', class, type,
admin_level, False, null) prank
WHERE prank.address_rank < 24
and rank_address between 1 and 25 -- select right index
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon') -- select right index
and prank.address_rank >= NEW.rank_address
and geometry && NEW.geometry
and geometry ~ NEW.geometry -- needed because ST_Relate does not do bbox cover test
@@ -916,7 +943,10 @@ BEGIN
LATERAL compute_place_rank(country_code, 'A', class, type,
admin_level, False, null) prank
WHERE osm_type = 'R'
and prank.address_rank = NEW.rank_address
and rank_address between 1 and 25 -- select right index
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon') -- select right index
and ((class = 'place' and prank.address_rank = NEW.rank_address)
or (class = 'boundary' and rank_address = NEW.rank_address))
and geometry && NEW.centroid and _ST_Covers(geometry, NEW.centroid)
LIMIT 1
LOOP
@@ -955,7 +985,7 @@ BEGIN
NEW.importance := null;
SELECT wikipedia, importance
FROM compute_importance(NEW.extratags, NEW.country_code, NEW.osm_type, NEW.osm_id)
FROM compute_importance(NEW.extratags, NEW.country_code, NEW.rank_search, NEW.centroid)
INTO NEW.wikipedia,NEW.importance;
{% if debug %}RAISE WARNING 'Importance computed from wikipedia: %', NEW.importance;{% endif %}
@@ -1037,7 +1067,7 @@ BEGIN
IF linked_place is not null THEN
-- Recompute the ranks here as the ones from the linked place might
-- have been shifted to accommodate surrounding boundaries.
SELECT place_id, osm_id, class, type, extratags,
SELECT place_id, osm_id, class, type, extratags, rank_search,
centroid, geometry,
(compute_place_rank(country_code, osm_type, class, type, admin_level,
(extratags->'capital') = 'yes', null)).*
@@ -1078,7 +1108,7 @@ BEGIN
SELECT wikipedia, importance
FROM compute_importance(location.extratags, NEW.country_code,
'N', location.osm_id)
location.rank_search, NEW.centroid)
INTO linked_wikipedia,linked_importance;
-- Use the maximum importance if one could be computed from the linked object.
@@ -1090,7 +1120,7 @@ BEGIN
ELSE
-- No linked place? As a last resort check if the boundary is tagged with
-- a place type and adapt the rank address.
IF NEW.rank_address > 0 and NEW.extratags ? 'place' THEN
IF NEW.rank_address between 4 and 25 and NEW.extratags ? 'place' THEN
SELECT address_rank INTO place_address_level
FROM compute_place_rank(NEW.country_code, 'A', 'place',
NEW.extratags->'place', 0::SMALLINT, False, null);
@@ -1101,6 +1131,15 @@ BEGIN
END IF;
END IF;
{% if not disable_diff_updates %}
IF OLD.rank_address != NEW.rank_address THEN
-- After a rank shift all addresses containing us must be updated.
UPDATE placex p SET indexed_status = 2 FROM place_addressline pa
WHERE pa.address_place_id = NEW.place_id and p.place_id = pa.place_id
and p.indexed_status = 0 and p.rank_address between 4 and 25;
END IF;
{% endif %}
IF NEW.admin_level = 2
AND NEW.class = 'boundary' AND NEW.type = 'administrative'
AND NEW.country_code IS NOT NULL AND NEW.osm_type = 'R'
@@ -1191,7 +1230,11 @@ BEGIN
{% endif %}
END IF;
IF NEW.postcode is null AND NEW.rank_search > 8 THEN
IF NEW.postcode is null AND NEW.rank_search > 8
AND (NEW.rank_address > 0
OR ST_GeometryType(NEW.geometry) not in ('ST_LineString','ST_MultiLineString')
OR ST_Length(NEW.geometry) < 0.02)
THEN
NEW.postcode := get_nearest_postcode(NEW.country_code, NEW.geometry);
END IF;


@@ -273,8 +273,8 @@ BEGIN
END IF;
RETURN ST_Envelope(ST_Collect(
ST_Project(geom, radius, 0.785398)::geometry,
ST_Project(geom, radius, 3.9269908)::geometry));
ST_Project(geom::geography, radius, 0.785398)::geometry,
ST_Project(geom::geography, radius, 3.9269908)::geometry));
END;
$$
LANGUAGE plpgsql IMMUTABLE;
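
Since PostGIS 3.4 ST_Project accepts geometry input but then gives a completely different (planar) result, hence the explicit `::geography` casts above. What the geography variant computes is a forward geodesic; a rough spherical sketch of that operation (PostGIS itself works on the spheroid, so results differ slightly):

```python
from math import asin, atan2, cos, degrees, radians, sin

EARTH_RADIUS = 6371008.8  # mean earth radius in metres

def project(lon, lat, distance, azimuth):
    """Destination point after travelling `distance` metres from
    (lon, lat) on bearing `azimuth` (radians clockwise from north),
    computed on a sphere."""
    lat1, lon1 = radians(lat), radians(lon)
    ang = distance / EARTH_RADIUS          # angular distance
    lat2 = asin(sin(lat1) * cos(ang) + cos(lat1) * sin(ang) * cos(azimuth))
    lon2 = lon1 + atan2(sin(azimuth) * sin(ang) * cos(lat1),
                        cos(ang) - sin(lat1) * sin(lat2))
    return degrees(lon2), degrees(lat2)
```

The two calls in the SQL project at bearings of 45° and 225° (0.785398 and 3.9269908 radians) from the centre point, and the envelope of those two corners yields a square box of the requested radius.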
@@ -429,9 +429,10 @@ BEGIN
SELECT osm_type, osm_id, class, type FROM placex WHERE place_id = placeid INTO osmtype, osmid, pclass, ptype;
DELETE FROM import_polygon_delete where osm_type = osmtype and osm_id = osmid and class = pclass and type = ptype;
DELETE FROM import_polygon_error where osm_type = osmtype and osm_id = osmid and class = pclass and type = ptype;
-- force delete from place/placex by making it a very small geometry
UPDATE place set geometry = ST_SetSRID(ST_Point(0,0), 4326) where osm_type = osmtype and osm_id = osmid and class = pclass and type = ptype;
DELETE FROM place where osm_type = osmtype and osm_id = osmid and class = pclass and type = ptype;
-- force delete by directly entering it into the to-be-deleted table
INSERT INTO place_to_be_deleted (osm_type, osm_id, class, type, deferred)
VALUES(osmtype, osmid, pclass, ptype, false);
PERFORM flush_deleted_places();
RETURN TRUE;
END;


@@ -10,65 +10,86 @@
CREATE INDEX IF NOT EXISTS idx_place_addressline_address_place_id
ON place_addressline USING BTREE (address_place_id) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_placex_rank_search
ON placex USING BTREE (rank_search) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_placex_rank_address
ON placex USING BTREE (rank_address) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_placex_parent_place_id
ON placex USING BTREE (parent_place_id) {{db.tablespace.search_index}}
WHERE parent_place_id IS NOT NULL;
---
CREATE INDEX IF NOT EXISTS idx_placex_geometry ON placex
USING GIST (geometry) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_placex_geometry_reverse_lookupPolygon
ON placex USING gist (geometry) {{db.tablespace.search_index}}
WHERE St_GeometryType(geometry) in ('ST_Polygon', 'ST_MultiPolygon')
AND rank_address between 4 and 25 AND type != 'postcode'
AND name is not null AND indexed_status = 0 AND linked_place_id is null;
---
CREATE INDEX IF NOT EXISTS idx_osmline_parent_place_id
ON location_property_osmline USING BTREE (parent_place_id) {{db.tablespace.search_index}}
WHERE parent_place_id is not null;
---
CREATE INDEX IF NOT EXISTS idx_osmline_parent_osm_id
ON location_property_osmline USING BTREE (osm_id) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_postcode_postcode
ON location_postcode USING BTREE (postcode) {{db.tablespace.search_index}};
{% if drop %}
---
DROP INDEX IF EXISTS idx_placex_geometry_address_area_candidates;
DROP INDEX IF EXISTS idx_placex_geometry_buildings;
DROP INDEX IF EXISTS idx_placex_geometry_lower_rank_ways;
DROP INDEX IF EXISTS idx_placex_wikidata;
DROP INDEX IF EXISTS idx_placex_rank_address_sector;
DROP INDEX IF EXISTS idx_placex_rank_boundaries_sector;
{% else %}
-- Indices only needed for updating.
{% if not drop %}
CREATE INDEX IF NOT EXISTS idx_placex_pendingsector
ON placex USING BTREE (rank_address,geometry_sector) {{db.tablespace.address_index}}
WHERE indexed_status > 0;
---
CREATE INDEX IF NOT EXISTS idx_location_area_country_place_id
ON location_area_country USING BTREE (place_id) {{db.tablespace.address_index}};
---
CREATE UNIQUE INDEX IF NOT EXISTS idx_place_osm_unique
ON place USING btree(osm_id, osm_type, class, type) {{db.tablespace.address_index}};
---
-- Table needed for running updates with osm2pgsql on place.
CREATE TABLE IF NOT EXISTS place_to_be_deleted (
osm_type CHAR(1),
osm_id BIGINT,
class TEXT,
type TEXT,
deferred BOOLEAN
);
{% endif %}
-- Indices only needed for search.
{% if 'search_name' in db.tables %}
---
CREATE INDEX IF NOT EXISTS idx_search_name_nameaddress_vector
ON search_name USING GIN (nameaddress_vector) WITH (fastupdate = off) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_search_name_name_vector
ON search_name USING GIN (name_vector) WITH (fastupdate = off) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_search_name_centroid
ON search_name USING GIST (centroid) {{db.tablespace.search_index}};
{% if postgres.has_index_non_key_column %}
---
CREATE INDEX IF NOT EXISTS idx_placex_housenumber
ON placex USING btree (parent_place_id)
INCLUDE (housenumber) {{db.tablespace.search_index}}
WHERE housenumber is not null;
---
CREATE INDEX IF NOT EXISTS idx_osmline_parent_osm_id_with_hnr
ON location_property_osmline USING btree(parent_place_id)
INCLUDE (startnumber, endnumber) {{db.tablespace.search_index}}
WHERE startnumber is not null;
{% endif %}
{% endif %}


@@ -137,7 +137,9 @@ CREATE TABLE place_addressline (
) {{db.tablespace.search_data}};
CREATE INDEX idx_place_addressline_place_id on place_addressline USING BTREE (place_id) {{db.tablespace.search_index}};
drop table if exists placex;
--------- PLACEX - storage for all indexed places -----------------
DROP TABLE IF EXISTS placex;
CREATE TABLE placex (
place_id BIGINT NOT NULL,
parent_place_id BIGINT,
@@ -157,20 +159,66 @@ CREATE TABLE placex (
postcode TEXT,
centroid GEOMETRY(Geometry, 4326)
) {{db.tablespace.search_data}};
CREATE UNIQUE INDEX idx_place_id ON placex USING BTREE (place_id) {{db.tablespace.search_index}};
CREATE INDEX idx_placex_osmid ON placex USING BTREE (osm_type, osm_id) {{db.tablespace.search_index}};
CREATE INDEX idx_placex_linked_place_id ON placex USING BTREE (linked_place_id) {{db.tablespace.address_index}} WHERE linked_place_id IS NOT NULL;
CREATE INDEX idx_placex_rank_search ON placex USING BTREE (rank_search, geometry_sector) {{db.tablespace.address_index}};
CREATE INDEX idx_placex_geometry ON placex USING GIST (geometry) {{db.tablespace.search_index}};
{% for osm_type in ('N', 'W', 'R') %}
CREATE INDEX idx_placex_osmid_{{osm_type | lower}} ON placex
USING BTREE (osm_id) {{db.tablespace.search_index}}
WHERE osm_type = '{{osm_type}}';
{% endfor %}
-- Usage: - removing linkage status on update
-- - lookup linked places for /details
CREATE INDEX idx_placex_linked_place_id ON placex
USING BTREE (linked_place_id) {{db.tablespace.address_index}}
WHERE linked_place_id IS NOT NULL;
-- Usage: - check that admin boundaries do not overtake each other rank-wise
-- - check that place node in an admin boundary with the same address level
-- - boundary is not completely contained in a place area
-- - parenting of large-area or unparentable features
CREATE INDEX idx_placex_geometry_address_area_candidates ON placex
USING gist (geometry) {{db.tablespace.address_index}}
WHERE rank_address between 1 and 25
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon');
-- Usage: - POI is within building with housenumber
CREATE INDEX idx_placex_geometry_buildings ON placex
USING {{postgres.spgist_geom}} (geometry) {{db.tablespace.search_index}}
USING {{postgres.spgist_geom}} (geometry) {{db.tablespace.address_index}}
WHERE address is not null and rank_search = 30
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon');
-- Usage: - linking of similar named places to boundaries
-- - linking of place nodes with same type to boundaries
-- - lookupPolygon()
CREATE INDEX idx_placex_geometry_placenode ON placex
USING {{postgres.spgist_geom}} (geometry) {{db.tablespace.search_index}}
USING {{postgres.spgist_geom}} (geometry) {{db.tablespace.address_index}}
WHERE osm_type = 'N' and rank_search < 26
and class = 'place' and type != 'postcode' and linked_place_id is null;
CREATE INDEX idx_placex_wikidata on placex USING BTREE ((extratags -> 'wikidata')) {{db.tablespace.address_index}} WHERE extratags ? 'wikidata' and class = 'place' and osm_type = 'N' and rank_search < 26;
and class = 'place' and type != 'postcode';
-- Usage: - is node part of a way?
-- - find parent of interpolation spatially
CREATE INDEX idx_placex_geometry_lower_rank_ways ON placex
USING {{postgres.spgist_geom}} (geometry) {{db.tablespace.address_index}}
WHERE osm_type = 'W' and rank_search >= 26;
-- Usage: - linking place nodes by wikidata tag to boundaries
CREATE INDEX idx_placex_wikidata on placex
USING BTREE ((extratags -> 'wikidata')) {{db.tablespace.address_index}}
WHERE extratags ? 'wikidata' and class = 'place'
and osm_type = 'N' and rank_search < 26;
-- The following two indexes function as a todo list for indexing.
CREATE INDEX idx_placex_rank_address_sector ON placex
USING BTREE (rank_address, geometry_sector) {{db.tablespace.address_index}}
WHERE indexed_status > 0;
CREATE INDEX idx_placex_rank_boundaries_sector ON placex
USING BTREE (rank_search, geometry_sector) {{db.tablespace.address_index}}
WHERE class = 'boundary' and type = 'administrative'
and indexed_status > 0;
DROP SEQUENCE IF EXISTS seq_place;
CREATE SEQUENCE seq_place start 1;
@@ -228,7 +276,7 @@ CREATE SEQUENCE file start 1;
-- null table so it won't error
-- deliberately no drop - importing the table is expensive and static, if it is already there better to avoid removing it
CREATE TABLE wikipedia_article (
CREATE TABLE IF NOT EXISTS wikipedia_article (
language text NOT NULL,
title text NOT NULL,
langcount integer,
@@ -242,15 +290,12 @@ CREATE TABLE wikipedia_article (
wd_page_title text,
instance_of text
);
ALTER TABLE ONLY wikipedia_article ADD CONSTRAINT wikipedia_article_pkey PRIMARY KEY (language, title);
CREATE INDEX idx_wikipedia_article_osm_id ON wikipedia_article USING btree (osm_type, osm_id);
CREATE TABLE wikipedia_redirect (
CREATE TABLE IF NOT EXISTS wikipedia_redirect (
language text,
from_title text,
to_title text
);
ALTER TABLE ONLY wikipedia_redirect ADD CONSTRAINT wikipedia_redirect_pkey PRIMARY KEY (language, from_title);
-- osm2pgsql does not create indexes on the middle tables for Nominatim
-- Add one for lookup of associated street relations.


@@ -1,6 +1,6 @@
# just use the pgxs makefile
foreach(suffix ${PostgreSQL_ADDITIONAL_VERSIONS} "14" "13" "12" "11" "10" "9.6")
foreach(suffix ${PostgreSQL_ADDITIONAL_VERSIONS} "15" "14" "13" "12" "11" "10" "9.6")
list(APPEND PG_CONFIG_HINTS
"/usr/pgsql-${suffix}/bin")
endforeach()


@@ -76,21 +76,25 @@ class UpdateAddData:
osm2pgsql_params = args.osm2pgsql_options(default_cache=1000, default_threads=1)
if args.file or args.diff:
return add_osm_data.add_data_from_file(cast(str, args.file or args.diff),
return add_osm_data.add_data_from_file(args.config.get_libpq_dsn(),
cast(str, args.file or args.diff),
osm2pgsql_params)
if args.node:
return add_osm_data.add_osm_object('node', args.node,
return add_osm_data.add_osm_object(args.config.get_libpq_dsn(),
'node', args.node,
args.use_main_api,
osm2pgsql_params)
if args.way:
return add_osm_data.add_osm_object('way', args.way,
return add_osm_data.add_osm_object(args.config.get_libpq_dsn(),
'way', args.way,
args.use_main_api,
osm2pgsql_params)
if args.relation:
return add_osm_data.add_osm_object('relation', args.relation,
return add_osm_data.add_osm_object(args.config.get_libpq_dsn(),
'relation', args.relation,
args.use_main_api,
osm2pgsql_params)


@@ -20,6 +20,7 @@ from nominatim.clicmd.args import NominatimArgs
LOG = logging.getLogger()
class AdminFuncs:
"""\
Analyse and maintain the database.
@@ -36,6 +37,8 @@ class AdminFuncs:
help='Migrate the database to a new software version')
objs.add_argument('--analyse-indexing', action='store_true',
help='Print performance analysis of the indexing process')
objs.add_argument('--collect-os-info', action="store_true",
help="Generate a report about the host system information")
group = parser.add_argument_group('Arguments for cache warming')
group.add_argument('--search-only', action='store_const', dest='target',
const='search',
@@ -70,8 +73,13 @@ class AdminFuncs:
from ..tools import migration
return migration.migrate(args.config, args)
return 1
if args.collect_os_info:
LOG.warning("Reporting System Information")
from ..tools import collect_os_info
collect_os_info.report_system_information(args.config)
return 0
return 1
def _warm(self, args: NominatimArgs) -> int:
LOG.warning('Warming database caches')


@@ -248,9 +248,9 @@ class APIDetails:
if args.node:
params = dict(osmtype='N', osmid=args.node)
elif args.way:
params = dict(osmtype='W', osmid=args.node)
params = dict(osmtype='W', osmid=args.way)
elif args.relation:
params = dict(osmtype='R', osmid=args.node)
params = dict(osmtype='R', osmid=args.relation)
else:
params = dict(place_id=args.place_id)
if args.object_class:


@@ -76,6 +76,7 @@ class NominatimArgs:
warm: bool
check_database: bool
migrate: bool
collect_os_info: bool
analyse_indexing: bool
target: Optional[str]
osm_id: Optional[str]
@@ -114,6 +115,7 @@ class NominatimArgs:
address_levels: bool
functions: bool
wiki_data: bool
secondary_importance: bool
importance: bool
website: bool
diffs: bool
@@ -182,8 +184,10 @@ class NominatimArgs:
return dict(osm2pgsql=self.config.OSM2PGSQL_BINARY or self.osm2pgsql_path,
osm2pgsql_cache=self.osm2pgsql_cache or default_cache,
osm2pgsql_style=self.config.get_import_style_file(),
osm2pgsql_style_path=self.config.config_dir,
threads=self.threads or default_threads,
dsn=self.config.get_libpq_dsn(),
forward_dependencies=self.config.get_bool('UPDATE_FORWARD_DEPENDENCIES'),
flatnode_file=str(self.config.get_path('FLATNODE_FILE') or ''),
tablespaces=dict(slim_data=self.config.TABLESPACE_OSM_DATA,
slim_index=self.config.TABLESPACE_OSM_INDEX,


@@ -63,6 +63,8 @@ class UpdateRefresh:
help='Update the PL/pgSQL functions in the database')
group.add_argument('--wiki-data', action='store_true',
help='Update Wikipedia/data importance numbers')
group.add_argument('--secondary-importance', action='store_true',
help='Update secondary importance raster data')
group.add_argument('--importance', action='store_true',
help='Recompute place importances (expensive!)')
group.add_argument('--website', action='store_true',
@@ -83,7 +85,7 @@ class UpdateRefresh:
help='Enable debug warning statements in functions')
def run(self, args: NominatimArgs) -> int: #pylint: disable=too-many-branches
def run(self, args: NominatimArgs) -> int: #pylint: disable=too-many-branches, too-many-statements
from ..tools import refresh, postcodes
from ..indexer.indexer import Indexer
@@ -115,6 +117,20 @@ class UpdateRefresh:
with connect(args.config.get_libpq_dsn()) as conn:
refresh.load_address_levels_from_config(conn, args.config)
# Attention: must come BEFORE functions
if args.secondary_importance:
with connect(args.config.get_libpq_dsn()) as conn:
# If the table did not exist before, then the importance code
# needs to be enabled.
if not conn.table_exists('secondary_importance'):
args.functions = True
LOG.warning('Import secondary importance raster data from %s', args.project_dir)
if refresh.import_secondary_importance(args.config.get_libpq_dsn(),
args.project_dir) > 0:
LOG.fatal('FATAL: Cannot update secondary importance raster data')
return 1
if args.functions:
LOG.warning('Create functions')
with connect(args.config.get_libpq_dsn()) as conn:


@@ -76,7 +76,8 @@ class UpdateReplication:
LOG.warning("Initialising replication updates")
with connect(args.config.get_libpq_dsn()) as conn:
replication.init_replication(conn, base_url=args.config.REPLICATION_URL)
replication.init_replication(conn, base_url=args.config.REPLICATION_URL,
socket_timeout=args.socket_timeout)
if args.update_functions:
LOG.warning("Create functions")
refresh.create_functions(conn, args.config, True, False)
@@ -87,7 +88,8 @@ class UpdateReplication:
from ..tools import replication
with connect(args.config.get_libpq_dsn()) as conn:
return replication.check_for_updates(conn, base_url=args.config.REPLICATION_URL)
return replication.check_for_updates(conn, base_url=args.config.REPLICATION_URL,
socket_timeout=args.socket_timeout)
def _report_update(self, batchdate: dt.datetime,
@@ -148,7 +150,7 @@ class UpdateReplication:
while True:
with connect(args.config.get_libpq_dsn()) as conn:
start = dt.datetime.now(dt.timezone.utc)
state = replication.update(conn, params)
state = replication.update(conn, params, socket_timeout=args.socket_timeout)
if state is not replication.UpdateState.NO_CHANGES:
status.log_status(conn, start, 'import')
batchdate, _, _ = status.get_status(conn)

View File

@@ -15,7 +15,7 @@ from pathlib import Path
import psutil
from nominatim.config import Configuration
from nominatim.db.connection import connect, Connection
from nominatim.db.connection import connect
from nominatim.db import status, properties
from nominatim.tokenizer.base import AbstractTokenizer
from nominatim.version import version_str
@@ -59,7 +59,7 @@ class SetupAll:
help="Do not keep tables that are only needed for "
"updating the database later")
group2.add_argument('--offline', action='store_true',
help="Do not attempt to load any additional data from the internet")
help="Do not attempt to load any additional data from the internet")
group3 = parser.add_argument_group('Expert options')
group3.add_argument('--ignore-errors', action='store_true',
help='Continue import even when errors in SQL are present')
@@ -72,6 +72,8 @@ class SetupAll:
from ..tools import database_import, refresh, postcodes, freeze
from ..indexer.indexer import Indexer
num_threads = args.threads or psutil.cpu_count() or 1
country_info.setup_country_config(args.config)
if args.continue_at is None:
@@ -94,14 +96,21 @@ class SetupAll:
drop=args.no_updates,
ignore_errors=args.ignore_errors)
self._setup_tables(args.config, args.reverse_only)
LOG.warning('Importing wikipedia importance data')
data_path = Path(args.config.WIKIPEDIA_DATA_PATH or args.project_dir)
if refresh.import_wikipedia_articles(args.config.get_libpq_dsn(),
data_path) > 0:
LOG.error('Wikipedia importance dump file not found. '
'Will be using default importances.')
'Calculating importance values of locations will not '
'use Wikipedia importance data.')
LOG.warning('Importing secondary importance raster data')
if refresh.import_secondary_importance(args.config.get_libpq_dsn(),
args.project_dir) != 0:
LOG.error('Secondary importance file not imported. '
'Falling back to default ranking.')
self._setup_tables(args.config, args.reverse_only)
if args.continue_at is None or args.continue_at == 'load-data':
LOG.warning('Initialise tables')
@@ -109,8 +118,7 @@ class SetupAll:
database_import.truncate_data_tables(conn)
LOG.warning('Load data into placex table')
database_import.load_data(args.config.get_libpq_dsn(),
args.threads or psutil.cpu_count() or 1)
database_import.load_data(args.config.get_libpq_dsn(), num_threads)
LOG.warning("Setting up tokenizer")
tokenizer = self._get_tokenizer(args.continue_at, args.config)
@@ -121,18 +129,15 @@ class SetupAll:
args.project_dir, tokenizer)
if args.continue_at is None or args.continue_at in ('load-data', 'indexing'):
if args.continue_at is not None and args.continue_at != 'load-data':
with connect(args.config.get_libpq_dsn()) as conn:
self._create_pending_index(conn, args.config.TABLESPACE_ADDRESS_INDEX)
LOG.warning('Indexing places')
indexer = Indexer(args.config.get_libpq_dsn(), tokenizer,
args.threads or psutil.cpu_count() or 1)
indexer = Indexer(args.config.get_libpq_dsn(), tokenizer, num_threads)
indexer.index_full(analyse=not args.index_noanalyse)
LOG.warning('Post-process tables')
with connect(args.config.get_libpq_dsn()) as conn:
database_import.create_search_indices(conn, args.config,
drop=args.no_updates)
drop=args.no_updates,
threads=num_threads)
LOG.warning('Create search index for default country names.')
country_info.create_country_names(conn, tokenizer,
args.config.get_str_list('LANGUAGES'))
@@ -188,27 +193,6 @@ class SetupAll:
return tokenizer_factory.get_tokenizer_for_db(config)
def _create_pending_index(self, conn: Connection, tablespace: str) -> None:
""" Add a supporting index for finding places still to be indexed.
This index is normally created at the end of the import process
for later updates. When indexing was only partially done, this
index can greatly speed up going through already indexed data.
"""
if conn.index_exists('idx_placex_pendingsector'):
return
with conn.cursor() as cur:
LOG.warning('Creating support index')
if tablespace:
tablespace = 'TABLESPACE ' + tablespace
cur.execute(f"""CREATE INDEX idx_placex_pendingsector
ON placex USING BTREE (rank_address,geometry_sector)
{tablespace} WHERE indexed_status > 0
""")
conn.commit()
def _finalize_database(self, dsn: str, offline: bool) -> None:
""" Determine the database date and set the status accordingly.
"""

View File

@@ -69,8 +69,8 @@ class DBConnection:
self.current_params: Optional[Sequence[Any]] = None
self.ignore_sql_errors = ignore_sql_errors
self.conn: Optional['psycopg2.connection'] = None
self.cursor: Optional['psycopg2.cursor'] = None
self.conn: Optional['psycopg2._psycopg.connection'] = None
self.cursor: Optional['psycopg2._psycopg.cursor'] = None
self.connect(cursor_factory=cursor_factory)
def close(self) -> None:
@@ -78,7 +78,7 @@ class DBConnection:
"""
if self.conn is not None:
if self.cursor is not None:
self.cursor.close() # type: ignore[no-untyped-call]
self.cursor.close()
self.cursor = None
self.conn.close()
@@ -94,7 +94,8 @@ class DBConnection:
# Use a dict to hand in the parameters because async is a reserved
# word in Python3.
self.conn = psycopg2.connect(**{'dsn': self.dsn, 'async': True})
self.conn = psycopg2.connect(**{'dsn': self.dsn, 'async': True}) # type: ignore
assert self.conn
self.wait()
if cursor_factory is not None:

View File

@@ -31,7 +31,7 @@ class Cursor(psycopg2.extras.DictCursor):
""" Query execution that logs the SQL query when debugging is enabled.
"""
if LOG.isEnabledFor(logging.DEBUG):
LOG.debug(self.mogrify(query, args).decode('utf-8')) # type: ignore[no-untyped-call]
LOG.debug(self.mogrify(query, args).decode('utf-8'))
super().execute(query, args)
@@ -55,7 +55,7 @@ class Cursor(psycopg2.extras.DictCursor):
if self.rowcount != 1:
raise RuntimeError("Query did not return a single row.")
result = self.fetchone() # type: ignore[no-untyped-call]
result = self.fetchone()
assert result is not None
return result[0]
@@ -131,7 +131,7 @@ class Connection(psycopg2.extensions.connection):
return False
if table is not None:
row = cur.fetchone() # type: ignore[no-untyped-call]
row = cur.fetchone()
if row is None or not isinstance(row[0], str):
return False
return row[0] == table
@@ -189,7 +189,7 @@ def connect(dsn: str) -> ConnectionContext:
try:
conn = psycopg2.connect(dsn, connection_factory=Connection)
ctxmgr = cast(ConnectionContext, contextlib.closing(conn))
ctxmgr.connection = cast(Connection, conn)
ctxmgr.connection = conn
return ctxmgr
except psycopg2.OperationalError as err:
raise UsageError(f"Cannot connect to database: {err}") from err
@@ -236,7 +236,7 @@ def get_pg_env(dsn: str,
"""
env = dict(base_env if base_env is not None else os.environ)
for param, value in psycopg2.extensions.parse_dsn(dsn).items(): # type: ignore
for param, value in psycopg2.extensions.parse_dsn(dsn).items():
if param in _PG_CONNECTION_STRINGS:
env[_PG_CONNECTION_STRINGS[param]] = value
else:

View File

@@ -41,4 +41,7 @@ def get_property(conn: Connection, name: str) -> Optional[str]:
if cur.rowcount == 0:
return None
return cast(Optional[str], cur.fetchone()[0]) # type: ignore[no-untyped-call]
result = cur.fetchone()
assert result is not None
return cast(Optional[str], result[0])
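The pattern above (fetch, assert non-None, then index) is the standard way to satisfy a type checker, since `fetchone()` is typed as returning an optional row. A minimal sketch of the same pattern using sqlite3 as a stand-in for the psycopg2 cursor (table and value are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE nominatim_properties(property TEXT, value TEXT)")
cur.execute("INSERT INTO nominatim_properties VALUES ('database_version', '4.2.4')")

cur.execute("SELECT value FROM nominatim_properties WHERE property = ?",
            ("database_version",))
result = cur.fetchone()
assert result is not None   # narrows Optional[Row] to Row for the type checker
value = result[0]
```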

View File

@@ -11,6 +11,7 @@ from typing import Set, Dict, Any
import jinja2
from nominatim.db.connection import Connection
from nominatim.db.async_connection import WorkerPool
from nominatim.config import Configuration
def _get_partitions(conn: Connection) -> Set[int]:
@@ -96,3 +97,21 @@ class SQLPreprocessor:
with conn.cursor() as cur:
cur.execute(sql)
conn.commit()
def run_parallel_sql_file(self, dsn: str, name: str, num_threads: int = 1,
**kwargs: Any) -> None:
""" Execure the given SQL files using parallel asynchronous connections.
The keyword arguments may supply additional parameters for
preprocessing.
After preprocessing the SQL code is cut at lines containing only
'---'. Each chunk is sent to one of the `num_threads` workers.
"""
sql = self.env.get_template(name).render(**kwargs)
parts = sql.split('\n---\n')
with WorkerPool(dsn, num_threads) as pool:
for part in parts:
pool.next_free_worker().perform(part)
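The chunking step in `run_parallel_sql_file` is plain string splitting: after template rendering, the SQL is cut at lines containing only `---`, and each chunk is dispatched to a free worker. A sketch of just the split (worker pool omitted, SQL content illustrative):

```python
# Rendered SQL with '---' marker lines separating independent statements,
# as consumed by run_parallel_sql_file() above.
sql = (
    "CREATE INDEX idx_a ON t USING BTREE (a);\n"
    "---\n"
    "CREATE INDEX idx_b ON t USING BTREE (b);\n"
    "---\n"
    "CREATE INDEX idx_c ON t USING BTREE (c);"
)

parts = sql.split('\n---\n')
# Each part would be handed to pool.next_free_worker().perform(part).
```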

View File

@@ -90,7 +90,7 @@ def get_status(conn: Connection) -> Tuple[Optional[dt.datetime], Optional[int],
if cur.rowcount < 1:
return None, None, None
row = cast(StatusRow, cur.fetchone()) # type: ignore[no-untyped-call]
row = cast(StatusRow, cur.fetchone())
return row['lastimportdate'], row['sequence_id'], row['indexed']

View File

@@ -118,4 +118,4 @@ class CopyBuffer:
"""
if self.buffer.tell() > 0:
self.buffer.seek(0)
cur.copy_from(self.buffer, table, columns=columns) # type: ignore[no-untyped-call]
cur.copy_from(self.buffer, table, columns=columns)

View File

@@ -128,58 +128,64 @@ class Indexer:
with conn.cursor() as cur:
cur.execute('ANALYZE')
self.index_by_rank(0, 4)
_analyze()
if self.index_by_rank(0, 4) > 0:
_analyze()
self.index_boundaries(0, 30)
_analyze()
if self.index_boundaries(0, 30) > 100:
_analyze()
self.index_by_rank(5, 25)
_analyze()
if self.index_by_rank(5, 25) > 100:
_analyze()
self.index_by_rank(26, 30)
_analyze()
if self.index_by_rank(26, 30) > 1000:
_analyze()
self.index_postcodes()
_analyze()
if self.index_postcodes() > 100:
_analyze()
def index_boundaries(self, minrank: int, maxrank: int) -> None:
def index_boundaries(self, minrank: int, maxrank: int) -> int:
""" Index only administrative boundaries within the given rank range.
"""
total = 0
LOG.warning("Starting indexing boundaries using %s threads",
self.num_threads)
with self.tokenizer.name_analyzer() as analyzer:
for rank in range(max(minrank, 4), min(maxrank, 26)):
self._index(runners.BoundaryRunner(rank, analyzer))
total += self._index(runners.BoundaryRunner(rank, analyzer))
def index_by_rank(self, minrank: int, maxrank: int) -> None:
return total
def index_by_rank(self, minrank: int, maxrank: int) -> int:
""" Index all entries of placex in the given rank range (inclusive)
in order of their address rank.
When rank 30 is requested then also interpolations and
places with address rank 0 will be indexed.
"""
total = 0
maxrank = min(maxrank, 30)
LOG.warning("Starting indexing rank (%i to %i) using %i threads",
minrank, maxrank, self.num_threads)
with self.tokenizer.name_analyzer() as analyzer:
for rank in range(max(1, minrank), maxrank + 1):
self._index(runners.RankRunner(rank, analyzer), 20 if rank == 30 else 1)
total += self._index(runners.RankRunner(rank, analyzer), 20 if rank == 30 else 1)
if maxrank == 30:
self._index(runners.RankRunner(0, analyzer))
self._index(runners.InterpolationRunner(analyzer), 20)
total += self._index(runners.RankRunner(0, analyzer))
total += self._index(runners.InterpolationRunner(analyzer), 20)
return total
def index_postcodes(self) -> None:
def index_postcodes(self) -> int:
"""Index the entries of the location_postcode table.
"""
LOG.warning("Starting indexing postcodes using %s threads", self.num_threads)
self._index(runners.PostcodeRunner(), 20)
return self._index(runners.PostcodeRunner(), 20)
def update_status_table(self) -> None:
@@ -191,7 +197,7 @@ class Indexer:
conn.commit()
def _index(self, runner: runners.Runner, batch: int = 1) -> None:
def _index(self, runner: runners.Runner, batch: int = 1) -> int:
""" Index a single rank or table. `runner` describes the SQL to use
for indexing. `batch` describes the number of objects that
should be processed with a single SQL statement
@@ -233,4 +239,4 @@ class Indexer:
conn.commit()
progress.done()
return progress.done()
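With `_index()` now returning a row count, `index_full()` can skip the expensive ANALYZE after stages that barely touched the tables. A minimal sketch of that thresholding pattern (hypothetical helper names, not the Nominatim API):

```python
def run_stage(stage, threshold, analyze, log):
    """Run one indexing stage; refresh planner statistics only when
    the stage processed more rows than the given threshold."""
    processed = stage()
    if processed > threshold:
        analyze()
        log.append(('analyze', processed))
    return processed

log = []
run_stage(lambda: 5, 100, lambda: None, log)     # below threshold: ANALYZE skipped
run_stage(lambda: 2500, 100, lambda: None, log)  # above threshold: ANALYZE runs
```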

View File

@@ -55,7 +55,7 @@ class ProgressLogger:
self.next_info += int(places_per_sec) * self.log_interval
def done(self) -> None:
def done(self) -> int:
""" Print final statistics about the progress.
"""
rank_end_time = datetime.now()
@@ -70,3 +70,5 @@ class ProgressLogger:
LOG.warning("Done %d/%d in %d @ %.3f per second - FINISHED %s\n",
self.done_places, self.total_places, int(diff_seconds),
places_per_sec, self.name)
return self.done_places

View File

@@ -566,8 +566,9 @@ class ICUNameAnalyzer(AbstractAnalyzer):
result = self._cache.housenumbers.get(norm_name, result)
if result[0] is None:
with self.conn.cursor() as cur:
cur.execute("SELECT getorcreate_hnr_id(%s)", (norm_name, ))
result = cur.fetchone()[0], norm_name # type: ignore[no-untyped-call]
hid = cur.scalar("SELECT getorcreate_hnr_id(%s)", (norm_name, ))
result = hid, norm_name
self._cache.housenumbers[norm_name] = result
else:
# Otherwise use the analyzer to determine the canonical name.
@@ -580,9 +581,9 @@ class ICUNameAnalyzer(AbstractAnalyzer):
variants = analyzer.compute_variants(word_id)
if variants:
with self.conn.cursor() as cur:
cur.execute("SELECT create_analyzed_hnr_id(%s, %s)",
(word_id, list(variants)))
result = cur.fetchone()[0], variants[0] # type: ignore[no-untyped-call]
hid = cur.scalar("SELECT create_analyzed_hnr_id(%s, %s)",
(word_id, list(variants)))
result = hid, variants[0]
self._cache.housenumbers[word_id] = result
return result
@@ -665,8 +666,7 @@ class ICUNameAnalyzer(AbstractAnalyzer):
with self.conn.cursor() as cur:
cur.execute("SELECT * FROM getorcreate_full_word(%s, %s)",
(token_id, variants))
full, part = cast(Tuple[int, List[int]],
cur.fetchone()) # type: ignore[no-untyped-call]
full, part = cast(Tuple[int, List[int]], cur.fetchone())
self._cache.names[token_id] = (full, part)

View File

@@ -544,8 +544,9 @@ class _TokenInfo:
with conn.cursor() as cur:
cur.execute("SELECT * FROM create_housenumbers(%s)", (simple_list, ))
self.data['hnr_tokens'], self.data['hnr'] = \
cur.fetchone() # type: ignore[no-untyped-call]
result = cur.fetchone()
assert result is not None
self.data['hnr_tokens'], self.data['hnr'] = result
def set_postcode(self, postcode: str) -> None:
@@ -574,8 +575,7 @@ class _TokenInfo:
cur.execute("""SELECT make_keywords(hstore('name' , %s))::text,
word_ids_from_name(%s)::text""",
(name, name))
return cast(Tuple[List[int], List[int]],
cur.fetchone()) # type: ignore[no-untyped-call]
return cast(Tuple[List[int], List[int]], cur.fetchone())
self.data['place_search'], self.data['place_match'] = \
self.cache.places.get(place, _get_place)
@@ -589,8 +589,7 @@ class _TokenInfo:
cur.execute("""SELECT addr_ids_from_name(%s)::text,
word_ids_from_name(%s)::text""",
(name, name))
return cast(Tuple[List[int], List[int]],
cur.fetchone()) # type: ignore[no-untyped-call]
return cast(Tuple[List[int], List[int]], cur.fetchone())
tokens = {}
for key, value in terms:

View File

@@ -0,0 +1,46 @@
# SPDX-License-Identifier: GPL-2.0-only
#
# This file is part of Nominatim. (https://nominatim.org)
#
# Copyright (C) 2022 by the Nominatim developer community.
# For a full list of authors see the git log.
"""
Sanitizer that preprocesses tags from the TIGER import.
It makes the following changes:
* remove state reference from tiger:county
"""
from typing import Callable
import re
from nominatim.tokenizer.sanitizers.base import ProcessInfo
from nominatim.tokenizer.sanitizers.config import SanitizerConfig
COUNTY_MATCH = re.compile('(.*), [A-Z][A-Z]')
def _clean_tiger_county(obj: ProcessInfo) -> None:
""" Remove the state reference from tiger:county tags.
This transforms a name like 'Hamilton, AL' into 'Hamilton'.
If no state reference is detected at the end, the name is left as is.
"""
if not obj.address:
return
for item in obj.address:
if item.kind == 'tiger' and item.suffix == 'county':
m = COUNTY_MATCH.fullmatch(item.name)
if m:
item.name = m[1]
# Switch kind and suffix, the split left them reversed.
item.kind = 'county'
item.suffix = 'tiger'
return
def create(_: SanitizerConfig) -> Callable[[ProcessInfo], None]:
""" Create a housenumber processing function.
"""
return _clean_tiger_county
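The `COUNTY_MATCH` regex can be exercised standalone: it strips a trailing two-letter state code and leaves names without one untouched (example county names are illustrative):

```python
import re

# Same pattern as COUNTY_MATCH in the sanitizer above.
COUNTY_MATCH = re.compile('(.*), [A-Z][A-Z]')

def clean_county(name: str) -> str:
    """Strip a trailing ', XX' state reference, if present."""
    m = COUNTY_MATCH.fullmatch(name)
    return m[1] if m else name
```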

View File

@@ -12,23 +12,34 @@ from pathlib import Path
import logging
import urllib
from nominatim.db.connection import connect
from nominatim.tools.exec_utils import run_osm2pgsql, get_url
LOG = logging.getLogger()
def add_data_from_file(fname: str, options: MutableMapping[str, Any]) -> int:
def _run_osm2pgsql(dsn: str, options: MutableMapping[str, Any]) -> None:
run_osm2pgsql(options)
# Handle deletions
with connect(dsn) as conn:
with conn.cursor() as cur:
cur.execute('SELECT flush_deleted_places()')
conn.commit()
def add_data_from_file(dsn: str, fname: str, options: MutableMapping[str, Any]) -> int:
""" Adds data from a OSM file to the database. The file may be a normal
OSM file or a diff file in all formats supported by libosmium.
"""
options['import_file'] = Path(fname)
options['append'] = True
run_osm2pgsql(options)
_run_osm2pgsql(dsn, options)
# No status update. We don't know where the file came from.
return 0
def add_osm_object(osm_type: str, osm_id: int, use_main_api: bool,
def add_osm_object(dsn: str, osm_type: str, osm_id: int, use_main_api: bool,
options: MutableMapping[str, Any]) -> int:
""" Add or update a single OSM object from the latest version of the
API.
@@ -51,6 +62,6 @@ def add_osm_object(osm_type: str, osm_id: int, use_main_api: bool,
options['append'] = True
options['import_data'] = get_url(base_url).encode('utf-8')
run_osm2pgsql(options)
_run_osm2pgsql(dsn, options)
return 0

View File

@@ -49,7 +49,7 @@ def _get_place_info(cursor: Cursor, osm_id: Optional[str],
LOG.fatal("OSM object %s not found in database.", osm_id)
raise UsageError("OSM object not found")
return cast(DictCursorResult, cursor.fetchone()) # type: ignore[no-untyped-call]
return cast(DictCursorResult, cursor.fetchone())
def analyse_indexing(config: Configuration, osm_id: Optional[str] = None,

View File

@@ -114,9 +114,10 @@ def _get_indexes(conn: Connection) -> List[str]:
indexes.extend(('idx_placex_housenumber',
'idx_osmline_parent_osm_id_with_hnr'))
if conn.table_exists('place'):
indexes.extend(('idx_placex_pendingsector',
'idx_location_area_country_place_id',
'idx_place_osm_unique'))
indexes.extend(('idx_location_area_country_place_id',
'idx_place_osm_unique',
'idx_placex_rank_address_sector',
'idx_placex_rank_boundaries_sector'))
return indexes
@@ -199,7 +200,7 @@ def check_tokenizer(_: Connection, config: Configuration) -> CheckResult:
def check_existance_wikipedia(conn: Connection, _: Configuration) -> CheckResult:
""" Checking for wikipedia/wikidata data
"""
if not conn.table_exists('search_name'):
if not conn.table_exists('search_name') or not conn.table_exists('place'):
return CheckState.NOT_APPLICABLE
with conn.cursor() as cur:
@@ -268,7 +269,7 @@ def check_database_index_valid(conn: Connection, _: Configuration) -> CheckResul
WHERE pg_index.indisvalid = false
AND pg_index.indexrelid = pg_class.oid""")
broken = list(cur)
broken = [c[0] for c in cur]
if broken:
return CheckState.FAIL, dict(indexes='\n '.join(broken))
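The fix above matters because iterating a cursor yields row tuples, not values; joining tuples into the error message would print entries like `('idx_foo',)`. A sqlite3 demonstration of the difference (stand-in for the psycopg2 cursor):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE idx(relname TEXT)")
cur.executemany("INSERT INTO idx VALUES (?)", [("idx_a",), ("idx_b",)])

cur.execute("SELECT relname FROM idx")
rows = list(cur)                 # row tuples, e.g. ('idx_a',)
cur.execute("SELECT relname FROM idx")
names = [c[0] for c in cur]      # plain strings, e.g. 'idx_a'
```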

View File

@@ -0,0 +1,166 @@
# SPDX-License-Identifier: GPL-2.0-only
#
# This file is part of Nominatim. (https://nominatim.org)
#
# Copyright (C) 2022 by the Nominatim developer community.
# For a full list of authors see the git log.
"""
Collection of host system information including software versions, memory,
storage, and database configuration.
"""
import os
import subprocess
import sys
from pathlib import Path
from typing import List, Optional, Tuple, Union
import psutil
from psycopg2.extensions import make_dsn, parse_dsn
from nominatim.config import Configuration
from nominatim.db.connection import connect
from nominatim.version import version_str
def convert_version(ver_tup: Tuple[int, int]) -> str:
"""converts tuple version (ver_tup) to a string representation"""
return ".".join(map(str, ver_tup))
def friendly_memory_string(mem: float) -> str:
"""Create a user friendly string for the amount of memory specified as mem"""
mem_magnitude = ("bytes", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB")
mag = 0
# determine order of magnitude
while mem > 1000:
mem /= 1000
mag += 1
return f"{mem:.1f} {mem_magnitude[mag]}"
def run_command(cmd: Union[str, List[str]]) -> str:
"""Runs a command using the shell and returns the output from stdout"""
try:
if sys.version_info < (3, 7):
cap_out = subprocess.run(cmd, stdout=subprocess.PIPE, check=False)
else:
cap_out = subprocess.run(cmd, capture_output=True, check=False)
return cap_out.stdout.decode("utf-8")
except FileNotFoundError:
# non-Linux system should end up here
return f"Unknown (unable to find the '{cmd}' command)"
def os_name_info() -> str:
"""Obtain Operating System Name (and possibly the version)"""
os_info = None
# man page os-release(5) details meaning of the fields
if Path("/etc/os-release").is_file():
os_info = from_file_find_line_portion(
"/etc/os-release", "PRETTY_NAME", "=")
# alternative location
elif Path("/usr/lib/os-release").is_file():
os_info = from_file_find_line_portion(
"/usr/lib/os-release", "PRETTY_NAME", "="
)
# fallback on Python's os name
if os_info is None or os_info == "":
os_info = os.name
# if the above is insufficient, take a look at neofetch's approach to OS detection
return os_info
# Note: Intended to be used on informational files like /proc
def from_file_find_line_portion(
filename: str, start: str, sep: str, fieldnum: int = 1
) -> Optional[str]:
"""open filename, finds the line starting with the 'start' string.
Splits the line using seperator and returns a "fieldnum" from the split."""
with open(filename, encoding='utf8') as file:
result = ""
for line in file:
if line.startswith(start):
result = line.split(sep)[fieldnum].strip()
return result
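The parsing step of `from_file_find_line_portion` boils down to one split; applied to a sample `/etc/os-release` line (content illustrative), it keeps the surrounding quotes of the value:

```python
# The line-parsing logic of from_file_find_line_portion() above,
# applied to a typical PRETTY_NAME entry from os-release(5).
line = 'PRETTY_NAME="Ubuntu 22.04.3 LTS"\n'
start, sep, fieldnum = 'PRETTY_NAME', '=', 1

result = None
if line.startswith(start):
    result = line.split(sep)[fieldnum].strip()
```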
def get_postgresql_config(version: int) -> str:
"""Retrieve postgres configuration file"""
try:
with open(f"/etc/postgresql/{version}/main/postgresql.conf", encoding='utf8') as file:
db_config = file.read()
return db_config
except IOError:
return f"**Could not read '/etc/postgresql/{version}/main/postgresql.conf'**"
def report_system_information(config: Configuration) -> None:
"""Generate a report about the host system including software versions, memory,
storage, and database configuration."""
with connect(make_dsn(config.get_libpq_dsn(), dbname='postgres')) as conn:
postgresql_ver: str = convert_version(conn.server_version_tuple())
with conn.cursor() as cur:
num = cur.scalar("SELECT count(*) FROM pg_catalog.pg_database WHERE datname=%s",
(parse_dsn(config.get_libpq_dsn())['dbname'], ))
nominatim_db_exists = num == 1 if isinstance(num, int) else False
if nominatim_db_exists:
with connect(config.get_libpq_dsn()) as conn:
postgis_ver: str = convert_version(conn.postgis_version_tuple())
else:
postgis_ver = "Unable to connect to database"
postgresql_config: str = get_postgresql_config(int(float(postgresql_ver)))
# Note: psutil.disk_partitions() is similar to run_command("lsblk")
# Note: run_command("systemd-detect-virt") only works on Linux, on other OSes
# should give a message: "Unknown (unable to find the 'systemd-detect-virt' command)"
# Generates the Markdown report.
report = f"""
**Instructions**
Use this information in your issue report at https://github.com/osm-search/Nominatim/issues
Redirect the output to a file:
$ ./collect_os_info.py > report.md
**Software Environment:**
- Python version: {sys.version}
- Nominatim version: {version_str()}
- PostgreSQL version: {postgresql_ver}
- PostGIS version: {postgis_ver}
- OS: {os_name_info()}
**Hardware Configuration:**
- RAM: {friendly_memory_string(psutil.virtual_memory().total)}
- number of CPUs: {psutil.cpu_count(logical=False)}
- bare metal/AWS/other cloud service (per systemd-detect-virt(1)): {run_command("systemd-detect-virt")}
- type and size of disks:
**`df -h` - report file system disk space usage:**
```
{run_command(["df", "-h"])}
```
**lsblk - list block devices:**
```
{run_command("lsblk")}
```
**Postgresql Configuration:**
```
{postgresql_config}
```
**Notes**
Please add any notes about anything above that is incorrect.
"""
print(report)

View File

@@ -75,6 +75,11 @@ def setup_database_skeleton(dsn: str, rouser: Optional[str] = None) -> None:
with conn.cursor() as cur:
cur.execute('CREATE EXTENSION IF NOT EXISTS hstore')
cur.execute('CREATE EXTENSION IF NOT EXISTS postgis')
postgis_version = conn.postgis_version_tuple()
if postgis_version[0] >= 3:
cur.execute('CREATE EXTENSION IF NOT EXISTS postgis_raster')
conn.commit()
_require_version('PostGIS',
@@ -95,7 +100,7 @@ def import_osm_data(osm_files: Union[Path, Sequence[Path]],
if not options['flatnode_file'] and options['osm2pgsql_cache'] == 0:
# Make some educated guesses about cache size based on the size
# of the import file and the available memory.
mem = psutil.virtual_memory() # type: ignore[no-untyped-call]
mem = psutil.virtual_memory()
fsize = 0
if isinstance(osm_files, list):
for fname in osm_files:
@@ -225,7 +230,8 @@ def load_data(dsn: str, threads: int) -> None:
cur.execute('ANALYSE')
def create_search_indices(conn: Connection, config: Configuration, drop: bool = False) -> None:
def create_search_indices(conn: Connection, config: Configuration,
drop: bool = False, threads: int = 1) -> None:
""" Create tables that have explicit partitioning.
"""
@@ -243,4 +249,5 @@ def create_search_indices(conn: Connection, config: Configuration, drop: bool =
sql = SQLPreprocessor(conn, config)
sql.run_sql_file(conn, 'indices.sql', drop=drop)
sql.run_parallel_sql_file(config.get_libpq_dsn(),
'indices.sql', min(8, threads), drop=drop)

View File

@@ -10,6 +10,7 @@ Helper functions for executing external programs.
from typing import Any, Union, Optional, Mapping, IO
from pathlib import Path
import logging
import os
import subprocess
import urllib.request as urlrequest
from urllib.parse import urlencode
@@ -116,21 +117,27 @@ def run_osm2pgsql(options: Mapping[str, Any]) -> None:
env = get_pg_env(options['dsn'])
cmd = [str(options['osm2pgsql']),
'--hstore', '--latlon', '--slim',
'--with-forward-dependencies', 'false',
'--log-progress', 'true',
'--number-processes', str(options['threads']),
'--number-processes', '1' if options['append'] else str(options['threads']),
'--cache', str(options['osm2pgsql_cache']),
'--output', 'gazetteer',
'--style', str(options['osm2pgsql_style'])
]
if options['append']:
cmd.append('--append')
if str(options['osm2pgsql_style']).endswith('.lua'):
env['LUA_PATH'] = ';'.join((str(options['osm2pgsql_style_path'] / 'flex-base.lua'),
os.environ.get('LUAPATH', ';')))
cmd.extend(('--output', 'flex'))
else:
cmd.append('--create')
cmd.extend(('--output', 'gazetteer'))
cmd.append('--append' if options['append'] else '--create')
if options['flatnode_file']:
cmd.extend(('--flat-nodes', options['flatnode_file']))
if not options.get('forward_dependencies', False):
cmd.extend(('--with-forward-dependencies', 'false'))
for key, param in (('slim_data', '--tablespace-slim-data'),
('slim_index', '--tablespace-slim-index'),
('main_data', '--tablespace-main-data'),

View File

@@ -315,3 +315,36 @@ def mark_internal_country_names(conn: Connection, config: Configuration, **_: An
names = {}
names['countrycode'] = country_code
analyzer.add_country_names(country_code, names)
@_migration(4, 1, 99, 0)
def add_place_deletion_todo_table(conn: Connection, **_: Any) -> None:
""" Add helper table for deleting data on updates.
The table is only necessary when updates are possible, i.e.
the database is not in freeze mode.
"""
if conn.table_exists('place'):
with conn.cursor() as cur:
cur.execute("""CREATE TABLE IF NOT EXISTS place_to_be_deleted (
osm_type CHAR(1),
osm_id BIGINT,
class TEXT,
type TEXT,
deferred BOOLEAN)""")
@_migration(4, 1, 99, 1)
def split_pending_index(conn: Connection, **_: Any) -> None:
""" Reorganise indexes for pending updates.
"""
if conn.table_exists('place'):
with conn.cursor() as cur:
cur.execute("""CREATE INDEX IF NOT EXISTS idx_placex_rank_address_sector
ON placex USING BTREE (rank_address, geometry_sector)
WHERE indexed_status > 0""")
cur.execute("""CREATE INDEX IF NOT EXISTS idx_placex_rank_boundaries_sector
ON placex USING BTREE (rank_search, geometry_sector)
WHERE class = 'boundary' and type = 'administrative'
and indexed_status > 0""")
cur.execute("DROP INDEX IF EXISTS idx_placex_pendingsector")

View File

@@ -15,7 +15,7 @@ from pathlib import Path
from psycopg2 import sql as pysql
from nominatim.config import Configuration
from nominatim.db.connection import Connection
from nominatim.db.connection import Connection, connect
from nominatim.db.utils import execute_file
from nominatim.db.sql_preprocessor import SQLPreprocessor
from nominatim.version import version_str
@@ -146,6 +146,25 @@ def import_wikipedia_articles(dsn: str, data_path: Path, ignore_errors: bool = F
return 0
def import_secondary_importance(dsn: str, data_path: Path, ignore_errors: bool = False) -> int:
""" Replaces the secondary importance raster data table with new data.
Returns 0 if all was well, 1 if the raster SQL file could not
be found and 2 if the PostGIS version is too old for raster
support. Throws an exception if there was an error reading the file.
"""
datafile = data_path / 'secondary_importance.sql.gz'
if not datafile.exists():
return 1
with connect(dsn) as conn:
postgis_version = conn.postgis_version_tuple()
if postgis_version[0] < 3:
LOG.error('PostGIS version is too old for using OSM raster data.')
return 2
execute_file(dsn, datafile, ignore_errors=ignore_errors)
return 0
def recompute_importance(conn: Connection) -> None:
""" Recompute wikipedia links and importance for all entries in placex.
@@ -157,7 +176,7 @@ def recompute_importance(conn: Connection) -> None:
cur.execute("""
UPDATE placex SET (wikipedia, importance) =
(SELECT wikipedia, importance
FROM compute_importance(extratags, country_code, osm_type, osm_id))
FROM compute_importance(extratags, country_code, rank_search, centroid))
""")
cur.execute("""
UPDATE placex s SET wikipedia = d.wikipedia, importance = d.importance

View File

@@ -7,13 +7,16 @@
"""
Functions for updating a database from a replication source.
"""
from typing import ContextManager, MutableMapping, Any, Generator, cast
from typing import ContextManager, MutableMapping, Any, Generator, cast, Iterator
from contextlib import contextmanager
import datetime as dt
from enum import Enum
import logging
import time
import types
import urllib.request as urlrequest
import requests
from nominatim.db import status
from nominatim.db.connection import Connection
from nominatim.tools.exec_utils import run_osm2pgsql
@@ -22,6 +25,7 @@ from nominatim.errors import UsageError
try:
from osmium.replication.server import ReplicationServer
from osmium import WriteHandler
from osmium import version as pyo_version
except ImportError as exc:
logging.getLogger().critical("pyosmium not installed. Replication functions not available.\n"
"To install pyosmium via pip: pip3 install osmium")
@@ -29,7 +33,8 @@ except ImportError as exc:
LOG = logging.getLogger()
def init_replication(conn: Connection, base_url: str) -> None:
def init_replication(conn: Connection, base_url: str,
socket_timeout: int = 60) -> None:
""" Set up replication for the server at the given base URL.
"""
LOG.info("Using replication source: %s", base_url)
@@ -38,9 +43,8 @@ def init_replication(conn: Connection, base_url: str) -> None:
# margin of error to make sure we get all data
date -= dt.timedelta(hours=3)
repl = ReplicationServer(base_url)
seq = repl.timestamp_to_sequence(date)
with _make_replication_server(base_url, socket_timeout) as repl:
seq = repl.timestamp_to_sequence(date)
if seq is None:
LOG.fatal("Cannot reach the configured replication service '%s'.\n"
@@ -53,7 +57,8 @@ def init_replication(conn: Connection, base_url: str) -> None:
LOG.warning("Updates initialised at sequence %s (%s)", seq, date)
def check_for_updates(conn: Connection, base_url: str) -> int:
def check_for_updates(conn: Connection, base_url: str,
socket_timeout: int = 60) -> int:
""" Check if new data is available from the replication service at the
given base URL.
"""
@@ -64,7 +69,8 @@ def check_for_updates(conn: Connection, base_url: str) -> int:
"Please run 'nominatim replication --init' first.")
return 254
state = ReplicationServer(base_url).get_state_info()
with _make_replication_server(base_url, socket_timeout) as repl:
state = repl.get_state_info()
if state is None:
LOG.error("Cannot get state for URL %s.", base_url)
@@ -86,7 +92,8 @@ class UpdateState(Enum):
NO_CHANGES = 3
def update(conn: Connection, options: MutableMapping[str, Any]) -> UpdateState:
def update(conn: Connection, options: MutableMapping[str, Any],
socket_timeout: int = 60) -> UpdateState:
""" Update database from the next batch of data. Returns the state of
updates according to `UpdateState`.
"""
@@ -114,7 +121,7 @@ def update(conn: Connection, options: MutableMapping[str, Any]) -> UpdateState:
options['import_file'].unlink()
# Read updates into file.
with _make_replication_server(options['base_url']) as repl:
with _make_replication_server(options['base_url'], socket_timeout) as repl:
outhandler = WriteHandler(str(options['import_file']))
endseq = repl.apply_diffs(outhandler, startseq + 1,
max_size=options['max_diff_size'] * 1024)
@@ -123,10 +130,7 @@ def update(conn: Connection, options: MutableMapping[str, Any]) -> UpdateState:
if endseq is None:
return UpdateState.NO_CHANGES
# Consume updates with osm2pgsql.
options['append'] = True
options['disable_jit'] = conn.server_version_tuple() >= (11, 0)
run_osm2pgsql(options)
run_osm2pgsql_updates(conn, options)
# Write the current status to the file
endstate = repl.get_state_info(endseq)
@@ -136,14 +140,59 @@ def update(conn: Connection, options: MutableMapping[str, Any]) -> UpdateState:
return UpdateState.UP_TO_DATE
def _make_replication_server(url: str) -> ContextManager[ReplicationServer]:
def run_osm2pgsql_updates(conn: Connection, options: MutableMapping[str, Any]) -> None:
""" Run osm2pgsql in append mode.
"""
# Remove any stale deletion marks.
with conn.cursor() as cur:
cur.execute('TRUNCATE place_to_be_deleted')
conn.commit()
# Consume updates with osm2pgsql.
options['append'] = True
options['disable_jit'] = conn.server_version_tuple() >= (11, 0)
run_osm2pgsql(options)
# Handle deletions
with conn.cursor() as cur:
cur.execute('SELECT flush_deleted_places()')
conn.commit()
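The deletion handling above follows a fixed sequence: clear stale deletion marks, run osm2pgsql in append mode, then flush the collected deletions. A minimal standalone sketch of that ordering, using recording stand-ins (`RecordingConn`/`RecordingCursor` are hypothetical, not Nominatim classes):

```python
class RecordingCursor:
    """Stand-in cursor that records every SQL statement it is given."""
    def __init__(self, log):
        self.log = log

    def execute(self, sql):
        self.log.append(sql)

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        return False


class RecordingConn:
    """Stand-in connection collecting statements and commits in order."""
    def __init__(self):
        self.log = []

    def cursor(self):
        return RecordingCursor(self.log)

    def commit(self):
        self.log.append('COMMIT')


def run_updates(conn, run_append):
    # 1. Remove any stale deletion marks from a previous run.
    with conn.cursor() as cur:
        cur.execute('TRUNCATE place_to_be_deleted')
    conn.commit()
    # 2. Consume the diff (stands in for run_osm2pgsql(options)).
    run_append()
    # 3. Process the deletions osm2pgsql marked during the append run.
    with conn.cursor() as cur:
        cur.execute('SELECT flush_deleted_places()')
    conn.commit()


conn = RecordingConn()
run_updates(conn, lambda: conn.log.append('OSM2PGSQL --append'))
print(conn.log)
```

The recorded log makes the invariant visible: the truncate must be committed before osm2pgsql runs, and the flush only happens afterwards.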
def _make_replication_server(url: str, timeout: int) -> ContextManager[ReplicationServer]:
""" Returns a ReplicationServer in form of a context manager.
Creates a light wrapper around older versions of pyosmium that did
not support the context manager interface.
"""
if hasattr(ReplicationServer, '__enter__'):
return cast(ContextManager[ReplicationServer], ReplicationServer(url))
# Patches the open_url function for pyosmium >= 3.2
# where the socket timeout is no longer respected.
def patched_open_url(self: ReplicationServer, url: urlrequest.Request) -> Any:
""" Download a resource from the given URL and return a byte sequence
of the content.
"""
headers = {"User-Agent" : f"Nominatim (pyosmium/{pyo_version.pyosmium_release})"}
if self.session is not None:
return self.session.get(url.get_full_url(),
headers=headers, timeout=timeout or None,
stream=True)
@contextmanager
def _get_url_with_session() -> Iterator[requests.Response]:
with requests.Session() as session:
request = session.get(url.get_full_url(),
headers=headers, timeout=timeout or None,
stream=True)
yield request
return _get_url_with_session()
repl = ReplicationServer(url)
setattr(repl, 'open_url', types.MethodType(patched_open_url, repl))
return cast(ContextManager[ReplicationServer], repl)
@contextmanager
def get_cm() -> Generator[ReplicationServer, None, None]:
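The fallback path above combines two generic Python techniques: probing for context-manager support with `hasattr(..., '__enter__')`, and rebinding a method on a single instance via `types.MethodType`. A self-contained sketch with a dummy class (a stand-in, not the real pyosmium `ReplicationServer`):

```python
import types
from contextlib import contextmanager


class LegacyServer:
    """Stand-in for an old server class without context-manager support."""
    def open_url(self, url):
        return f'default fetch of {url}'


def make_server(url, timeout):
    if hasattr(LegacyServer, '__enter__'):
        # Modern versions already work as context managers.
        return LegacyServer()

    # Older versions: replace open_url on this one instance only,
    # so the timeout is honoured.
    def patched_open_url(self, resource):
        return f'fetch {resource} with timeout={timeout or None}'

    srv = LegacyServer()
    srv.open_url = types.MethodType(patched_open_url, srv)

    @contextmanager
    def as_cm():
        yield srv

    return as_cm()


with make_server('https://example.org/replication', 60) as repl:
    result = repl.open_url('state.txt')
print(result)
```

Note the `timeout or None` idiom from the patch above: a timeout of `0` is mapped to `None`, which for `requests` means "wait indefinitely" rather than "fail immediately".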

View File

@@ -2,7 +2,7 @@
#
# This file is part of Nominatim. (https://nominatim.org)
#
# Copyright (C) 2022 by the Nominatim developer community.
# Copyright (C) 2023 by the Nominatim developer community.
# For a full list of authors see the git log.
"""
Version information for Nominatim.
@@ -25,7 +25,7 @@ from typing import Optional, Tuple
# patch level when cherry-picking the commit with the migration.
#
# Released versions always have a database patch level of 0.
NOMINATIM_VERSION = (4, 1, 0, 0)
NOMINATIM_VERSION = (4, 2, 4, 0)
POSTGRESQL_REQUIRED_VERSION = (9, 6)
POSTGIS_REQUIRED_VERSION = (2, 2)
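These requirements are plain tuples compared element-wise; the same idiom drives runtime checks such as `conn.server_version_tuple() >= (11, 0)` in the replication code. A quick illustration of Python's lexicographic tuple ordering:

```python
POSTGRESQL_REQUIRED_VERSION = (9, 6)
POSTGIS_REQUIRED_VERSION = (2, 2)


def meets_requirement(version, required):
    # Tuples compare lexicographically: first elements decide unless equal,
    # then comparison moves to the next element.
    return version >= required


print(meets_requirement((11, 0), POSTGRESQL_REQUIRED_VERSION))  # major 11 > 9
print(meets_requirement((9, 5), POSTGRESQL_REQUIRED_VERSION))   # 9.5 < 9.6
print(meets_requirement((3, 4, 1), POSTGIS_REQUIRED_VERSION))   # longer tuples compare too
```

This is why a bare tuple suffices as a version constant: no parsing or custom comparison class is needed as long as all components are integers.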

View File

@@ -1,71 +0,0 @@
name:
default: De Nederlandse Antillen
af: Nederlandse Antille
an: Antillas Neerlandesas
ar: جزر الأنتيل
be: Нідэрландскія Антылы
bg: Холандски Антили
br: Antilhez Nederlandat
bs: Holandski Antili
ca: Antilles Neerlandeses
cs: Nizozemské Antily
cy: Antilles yr Iseldiroedd
da: Nederlandske Antiller
de: Niederländische Antillen
dv: ނެދަލޭންޑު އެންޓިލޭ
el: Ολλανδικές Αντίλλες
en: Netherlands Antilles
eo: Nederlandaj Antiloj
es: Antillas Neerlandesas;Antillas Holandesas;Indias Occidentales Holandesas
et: Hollandi Antillid
eu: Holandarren Antillak
fa: آنتیل هلند
fi: Alankomaiden Antillit
fo: Niðurlendsku Antillurnar
fr: Antilles néerlandaises
fy: Nederlânske Antillen
ga: Aintillí na hÍsiltíre
gl: Antillas Neerlandesas
he: האנטילים ההולנדיים
hi: नीदरलैंड एंटीलीज़
hr: Nizozemski Antili
hu: Holland Antillák
ia: Antillas Nederlandese
id: Antillen Belanda
io: Nederlandana Antili
is: Hollensku Antillaeyjar
it: Antille Olandesi
ja: オランダ領アンティル
jv: Antillen Walanda
ka: ნიდერლანდის ანტილები
kk: Антийлер
ko: 네덜란드령 안틸레스
kw: Antillys Iseldiryek
la: Antillae Nederlandiae
lb: Hollännesch Antillen
li: Nederlandse Antille
ln: Antiya ya Holanda
lt: Nyderlandų Antilai
lv: Antiļas
mn: Нидерландын Антиллийн Арлууд
mr: नेदरलँड्स अँटिल्स
ms: Antillen Belanda
nn: Dei nederlandske Antillane
"no": De nederlandske Antillene
pl: Antyle Holenderskie
pt: Antilhas Holandesas
ro: Antilele Olandeze
ru: Нидерландские Антилы
sh: Nizozemski Antili
sk: Holandské Antily
sl: Nizozemski Antili
sr: Холандски Антили
sv: Nederländska Antillerna
sw: Antili za Kiholanzi
ta: நெதர்லாந்து அண்டிலிசு
tg: Антил Ҳоланд
th: เนเธอร์แลนด์แอนทิลลิส
tr: Hollanda Antilleri
uk: Нідерландські Антильські острови
vi: Antille thuộc Hà Lan
zh: 荷属安的列斯

View File

@@ -1,2 +0,0 @@
name:
default: Antarctica

View File

@@ -1,2 +0,0 @@
name:
default: American Samoa

View File

@@ -1,2 +0,0 @@
name:
default: Aruba

View File

@@ -1,2 +0,0 @@
name:
default: Aland Islands

View File

@@ -1,2 +0,0 @@
name:
default: Saint Barthélemy

View File

@@ -1,2 +0,0 @@
name:
default: "\N"

View File

@@ -1,2 +0,0 @@
name:
default: Bouvet Island

View File

@@ -1,37 +0,0 @@
name:
default: Cocos (Keeling) Islands
af: Cocos (Keeling) Eilande
ar: جزر كوكوس (كيلينغ)
be: Какосавыя (Кілінг) астравы
br: Inizi Kokoz
ca: Illes Cocos
da: Cocosøerne
de: Kokosinseln
el: Νησιά Κόκος
en: Cocos (Keeling) Islands
eo: Kokosinsuloj
es: Islas Cocos (Keeling)
et: Kookossaared
eu: Cocos (Keeling) uharteak
fa: جزایر کوکوس
fi: Kookossaaret
fr: Îles Cocos
fy: de Kokoseilannen
he: איי קוקוס (קילינג)
hr: Kokosovi otoci
hu: Kókusz (Keeling)-szigetek
id: Kepulauan Cocos (Keeling)
is: Kókoseyjar
it: Isole Cocos e Keeling
lt: Kokoso (Keelingo) salos
lv: Kokosu (Kīlinga) salas
mn: Кокосын (Кийлингийн) Арлууд
nl: Cocoseilanden
pl: Wyspy Kokosowe
ru: Кокосовые острова
sl: Kokosovi otoki
sv: Kokosöarna
tr: Cocos (Keeling) Adaları
uk: Кокосові острови
vi: Quần đảo Cocos (Keeling)
zh: 科科斯(基林)群島

View File

@@ -1,7 +0,0 @@
name:
default: Curaçao
en: Curaçao
es: Curazao
fr: Curaçao
ru: Кюрасао
sv: Curaçao

View File

@@ -1,61 +0,0 @@
name:
default: Christmas Island
af: Christmas-eiland
ar: جزيرة الميلاد
bg: Рождество
br: Enez Nedeleg
bs: Božićno ostrvo
ca: Illa Christmas
cs: Vánoční ostrov
cy: Ynys y Nadolig
da: Juleøen
de: Weihnachtsinsel
el: Νήσος των Χριστουγέννων
eo: Kristnaskinsulo
es: Isla de Navidad
et: Jõulusaar
eu: Christmas uhartea
fa: جزیره کریسمس
fi: Joulusaari
fr: Île Christmas
fy: Krysteilân
ga: Oileán na Nollag
gl: Illa de Nadal
he: טריטוריית האי חג המולד
hi: क्रिसमस आईलैंड
hr: Božićni otok
hu: Karácsony-sziget
id: Pulau Natal
is: Jólaeyja
it: Isola del Natale
ja: クリスマス島
ka: შობის კუნძული
kk: Кристмас аралы
ko: 크리스마스 섬
kw: Ynys Nadelik
lb: Chrëschtdagsinsel
lt: Kalėdų sala
lv: Ziemsvētku sala
mn: Зул Сарын Арал
mr: क्रिसमस द्वीप
ms: Pulau Krismas
nl: Christmaseiland
nn: Christmasøya
"no": Christmasøya
pl: Wyspa Bożego Narodzenia
pt: Ilha Christmas
ro: Insula Crăciunului
ru: Остров Рождества
sh: Božićni otok
sk: Vianočný ostrov
sl: Božični otoki
sr: Божићно Острво
sv: Julön
sw: Kisiwa cha Krismasi
ta: கிறிஸ்துமசு தீவு
th: เกาะคริสต์มาส
tr: Christmas Adası
uk: Острів Різдва
vi: Đảo Christmas
wo: Dunu Christmas
zh: 圣诞岛

View File

@@ -1,41 +0,0 @@
name:
default: Guyane Française
af: Frans-Guyana
ar: غيانا
br: Gwiana chall
ca: Guaiana Francesa
cy: Guyane
da: Fransk Guyana
de: Französisch-Guayana
el: Γαλλική Γουιάνα
en: French Guiana
eo: Gujano
es: Guayana Francesa
et: Prantsuse Guajaana
fa: گویان فرانسه
fi: Ranskan Guayana
fr: Guyane française
fy: Frânsk Guyana
ga: Guáin na Fraince
gd: Guiana Fhrangach
he: גיאנה הצרפתית
hr: Francuska Gvajana
hu: Francia Guyana
id: Guyana Perancis
is: Franska Gvæjana
it: Guyana francese
la: Guiana Francica
li: Frans Guyana
lt: Prancūzijos Gviana
lv: Franču Gviāna
mn: Франц Гвиана
nl: Frans-Guyana
pl: Gujana Francuska
ru: Французская Гвиана
sl: Francoska Gvajana
sv: Franska Guyana
th: เฟรนช์เกียนา
tr: Fransız Guyanası
uk: Французька Гвіана
vi: Guyane thuộc Pháp
zh: 法属圭亚那

View File

@@ -1,31 +0,0 @@
name:
default: Guadeloupe
ar: غوادلوب
be: Гвадэлупа
br: Gwadeloup
ca: Illa de Guadalupe
da: Guadeloupe
el: Γουαδελούπη
en: Guadeloupe
eo: Gvadelupo
es: Guadalupe
fa: گوادلوپ
fi: Guadeloupe
fr: Guadeloupe
fy: Guadelûp
ga: Guadalúip
he: גוואדלופ
hr: Gvadalupa
hu: Guadeloupe
is: Gvadelúpeyjar
it: Guadalupa
la: Guadalupa
lt: Gvadelupa
lv: Gvadelupa
mn: Гуаделупе
pl: Gwadelupa
ru: Гваделупа
sv: Guadeloupe
th: กวาเดอลูป
uk: Гваделупа
zh: 瓜德罗普

View File

@@ -1,2 +0,0 @@
name:
default: Guam

View File

@@ -1,2 +0,0 @@
name:
default: Hong Kong

View File

@@ -1,2 +0,0 @@
name:
default: Heard Island and McDonald Islands

View File

@@ -1,2 +0,0 @@
name:
default: Saint Martin

View File

@@ -1,2 +0,0 @@
name:
default: Macao

View File

@@ -1,2 +0,0 @@
name:
default: Northern Mariana Islands

View File

@@ -1,30 +0,0 @@
name:
default: Martinique
ar: مارتينيك
be: Марцініка
br: Martinik
ca: Martinica
da: Martinique
el: Μαρτινίκα
en: Martinique
eo: Martiniko
es: Martinica
fa: مارتینیک
fi: Martinique
fr: Martinique
fy: Martinyk
he: מרטיניק
hr: Martinik
hu: Martinique
id: Martinik
is: Martinique
it: Martinica
la: Martinica
lt: Martinika
lv: Martinika
mn: Мартиник
pl: Martynika
ru: Мартиника
sv: Martinique
uk: Мартиніка
zh: 馬提尼克

View File

@@ -1,37 +0,0 @@
name:
default: Nouvelle-Calédonie
af: Nieu-Caledonia
ar: كاليدونيا الجديدة
be: Новая Каледонія
br: Kaledonia Nevez
ca: Nova Caledònia
cy: Caledonia Newydd
da: Ny Kaledonien
de: Neukaledonien
el: Νέα Καληδονία
en: New Caledonia
eo: Nov-Kaledonio
es: Nueva Caledonia
fa: کالدونیای جدید
fi: Uusi-Kaledonia
fr: Nouvelle-Calédonie
ga: An Nua-Chaladóin
he: קלדוניה החדשה
hr: Nova Kaledonija
hu: Új-Kaledónia
id: Kaledonia Baru
is: Nýja-Kaledónía
it: Nuova Caledonia
la: Nova Caledonia
lt: Naujoji Kaledonija
lv: Jaunkaledonija
mn: Шинэ Каледони
nl: Nieuw-Caledonië
pl: Nowa Kaledonia
ru: Новая Каледония
sl: Nova Kaledonija
sv: Nya Kaledonien
th: นิวแคลิโดเนีย
tr: Yeni Kaledonya
uk: Нова Каледонія
zh: 新喀里多尼亚

View File

@@ -1,36 +0,0 @@
name:
default: Norfolk Island
af: Norfolkeiland
ar: جزيرة نورفولك
be: Норфалк
br: Enez Norfolk
ca: Illa Norfolk
cy: Ynys Norfolk
da: Norfolk-øen
de: Norfolkinsel
en: Norfolk Island
eo: Norfolkinsulo
es: Isla Norfolk
et: Norfolki saar
fi: Norfolkinsaari
fr: Île Norfolk
fy: Norfolk
ga: Oileán Norfolk
he: האי נורפוק
hr: Otok Norfolk
hu: Norfolk-sziget
id: Pulau Norfolk
is: Norfolkeyja
it: Isola Norfolk
la: Insula Norfolcia
lt: Norfolko sala
lv: Norfolkas sala
mn: Норфолк Арал
nl: Norfolk
pl: Wyspa Norfolk
ru: Остров Норфолк
sv: Norfolkön
tr: Norfolk Adası
uk: Острів Норфолк
vi: Đảo Norfolk
zh: 诺福克岛

View File

@@ -1,77 +0,0 @@
name:
default: Polynésie française
af: Franse Polynesië
an: Polinesia Franzesa
ar: بولونيزيا الفرنسية
az: Fransa Polineziyası
be: Французская Палінезія
bg: Френска Полинезия
br: Polinezia Frañs
bs: Francuska Polinezija
ca: Polinèsia Francesa
cs: Francouzská Polynésie
cy: Polynesia Ffrengig
da: Fransk Polynesien
de: Französisch-Polynesien
dv: ފަރަންސޭސި ޕޮލިނޭޝިއާ
el: Γαλλική Πολυνησία
en: French Polynesia
eo: Franca Polinezio
es: Polinesia Francesa
et: Prantsuse Polüneesia
eu: Frantziar Polinesia
fa: پلی‌نزی فرانسه
fi: Ranskan Polynesia
fr: Polynésie française
fy: Frânsk Polyneezje
ga: Polainéis na Fraince
gd: French Polynesia
gl: Polinesia francesa
he: פולינזיה הצרפתית
hi: फ्रेंच पोलीनेशिया
hr: Francuska Polinezija
hu: Francia Polinézia
id: Polinesia Perancis
io: Franca Polinezia
is: Franska Pólýnesía
it: Polinesia francese
ja: フランス領ポリネシア
jv: Polinesia Perancis
kk: Франция Полинезиясы
ko: 프랑스령 폴리네시아
kw: Polynesi Frynkek
la: Polynesia Francica
lb: Franséisch-Polynesien
lt: Prancūzijos Polinezija
lv: Franču Polinēzija
mi: Porinīhia Wīwī
mk: Француска Полинезија
mn: Францын Полинез
mr: फ्रेंच पॉलिनेशिया
ms: Polinesia Perancis
nl: Frans-Polynesië
nn: Fransk Polynesia
"no": Fransk Polynesia
oc: Polinesia Francesa
os: Францы Полинези
pl: Polinezja Francuska
pt: Polinésia Francesa
qu: Phransis Pulinisya
ro: Polinezia Franceză
ru: Французская Полинезия
se: Frankriikka Polynesia
sh: Francuska Polinezija
sk: Francúzska Polynézia
sl: Francoska Polinezija
sr: Француска Полинезија
sv: Franska Polynesien
sw: Polynesia ya Kifaransa
ta: பிரெஞ்சு பொலினீசியா
th: เฟรนช์โปลินีเซีย
tr: Fransız Polinezyası
ty: Pōrīnetia Farāni
ug: Fransiyige Qarashliq Polinéziye
uk: Французька Полінезія
vi: Polynésie thuộc Pháp
wo: Polineesi gu Faraas
zh: 法属波利尼西亚

View File

@@ -1,19 +0,0 @@
name:
default: Saint-Pierre-et-Miquelon
af: Saint-Pierre et Miquelon
be: Святы П’ер і Міквелон
da: Saint Pierre og Miquelon
de: Saint-Pierre und Miquelon
en: Saint Pierre and Miquelon
eo: Sankta-Piero kaj Mikelono
es: San Pedro y Miguelón
fi: Saint-Pierre ja Miquelon
fr: Saint-Pierre-et-Miquelon
hr: Sveti Petar i Mikelon
hu: Saint-Pierre és Miquelon
lt: Sen Pjeras ir Mikelonas
lv: Senpjēra un Mikelona
mn: Сент Пьер ба Микелон
sv: Saint-Pierre och Miquelon
tr: Saint-Pierre ve Miquelon
uk: Сен-П'єр і Мікелон

View File

@@ -1,2 +0,0 @@
name:
default: Puerto Rico

View File

@@ -1,29 +0,0 @@
name:
default: Réunion
af: Réunion
ar: ريونيون
be: Руньён
br: Ar Reunion
ca: Illa de la Reunió
da: Reunion
el: Ρεϊνιόν
eo: Reunio
es: La Reunión
fa: رئونیون
fi: Réunion
fr: La Réunion
he: ראוניון
hu: Réunion
is: Réunion
it: Riunione
la: Reunio
lt: Reunionas
lv: Reinjona
mn: Реюньон
pl: Reunion
ru: Реюньон
sl: Reunion
sv: Réunion
th: เรอูนียง
uk: Реюньйон
zh: 留尼汪

View File

@@ -1,2 +0,0 @@
name:
default: Svalbard and Jan Mayen

View File

@@ -1,2 +0,0 @@
name:
default: Sint Maarten

View File

@@ -1,48 +0,0 @@
name:
default: Terres australes et antarctiques françaises
af: Franse Suidelike en Antarktiese Gebiede
an: Territorios Australs Franzeses
ar: الأراضي الجنوبية الفرنسية
be: Французскія Паўднёвыя тэрыторыі
bg: Френски южни и антарктически територии
br: Douaroù Aostral hag Antarktikel Frañs
ca: Terres Australs i Antàrtiques Franceses
cs: Francouzská jižní a antarktická území
da: Franske sydlige og Antarktiske territorier
de: Französische Süd- und Antarktisgebiete
el: Γαλλικά νότια και ανταρκτικά εδάφη
en: French Southern Lands
eo: Francaj Sudaj Teritorioj
es: Tierras Australes y Antárticas Francesas
eu: Frantziaren lurralde austral eta antartikoak
fi: Ranskan eteläiset ja antarktiset alueet
fr: Terres australes et antarctiques françaises
fy: Frânske Súdlike en Antarktyske Lannen
gl: Terras Austrais e Antárticas Francesas
hr: Francuski južni i antarktički teritoriji
hu: Francia déli és antarktiszi területek
id: Daratan Selatan dan Antarktika Perancis
is: Frönsku suðlægu landsvæðin
it: Terre Australi e Antartiche Francesi
ja: フランス領南方・南極地域
ko: 프랑스령 남부와 남극 지역
kw: Tiryow Deghow hag Antarktik Frynkek
lt: Prancūzijos Pietų Sritys
lv: Francijas Dienvidjūru un Antarktikas Zemes
nl: Franse Zuidelijke en Antarctische Gebieden
"no": De franske sørterritorier
oc: Tèrras Australas e Antarticas Francesas
pl: Francuskie Terytoria Południowe i Antarktyczne
pt: Terras Austrais e Antárticas Francesas
ro: Teritoriile australe şi antarctice franceze
ru: Французские Южные и Антарктические территории
sh: Francuske Južne Teritorije
sk: Francúzske južné a antarktické územia
sl: Francoske južne in antarktične dežele
sr: Француске јужне и антарктичке земље
sv: Franska sydterritorierna
ta: பிரெஞ்சு தென்னக நிலங்களும் அண்டாடிக் நிலமும்
tr: Fransız Güney ve Antarktika Toprakları
uk: Французькі Південні та Антарктичні території
vi: Vùng đất phía Nam và châu Nam Cực thuộc Pháp
zh: 法属南部领地

View File

@@ -1,2 +0,0 @@
name:
default: United States Minor Outlying Islands

View File

@@ -1,2 +0,0 @@
name:
default: United States Virgin Islands

View File

@@ -1,68 +0,0 @@
name:
default: Wallis-et-Futuna
af: Wallis-en-Futuna
an: Wallis e Futuna
ar: جزر واليس وفوتونا
be: Уоліс і Футуна
bg: Уолис и Футуна
br: Wallis ha Futuna
ca: Wallis i Futuna
cs: Wallis a Futuna
cy: Wallis a Futuna
da: Wallis og Futuna
de: Wallis und Futuna
dv: ވާލީ އަދި ފުތޫނާ
el: Ουώλλις και Φουτούνα
en: Wallis and Futuna Islands
eo: Valiso kaj Futuno
es: Wallis y Futuna
et: Wallis ja Futuna
eu: Wallis eta Futuna
fa: والیس و فوتونا
fi: Wallis- ja Futunasaaret
fr: Wallis-et-Futuna
fy: Wallis en Fûtûna
ga: Vailís agus Futúna
gl: Wallis e Futuna
he: ואליס ופוטונה
hr: Wallis i Futuna
hu: Wallis és Futuna
id: Wallis dan Futuna
io: Wallis e Futuna Insuli
is: Wallis- og Fútúnaeyjar
it: Wallis e Futuna
ja: ウォリス・フツナ
jv: Wallis lan Futuna
ko: 왈리스 퓌튀나
kw: Wallis ha Futuna
la: Vallis et Futuna
lb: Wallis a Futuna
lt: Walliso ir Futuna salos
lv: Volisa un Futuna
mn: Уоллис ба Футуна
mr: वालिस व फुतुना
ms: Wallis dan Futuna
nl: Wallis en Futuna
nn: Wallis- og Futunaøyane
"no": Wallis- og Futunaøyene
oc: Wallis e Futuna
pl: Wallis i Futuna
pt: Wallis e Futuna
ro: Wallis şi Futuna
ru: Уоллис и Футуна
se: Wallis ja Futuna
sh: Wallis i Futuna
sk: Wallis a Futuna
sl: Wallis in Futuna
sm: Wallis and Futuna
sr: Валис и Футуна
sv: Wallis- och Futunaöarna
sw: Wallis na Futuna
ta: வலிசும் புட்டூனாவும்
th: หมู่เกาะวาลลิสและหมู่เกาะฟุตูนา
tr: Wallis ve Futuna Adaları
ug: Wallis we Futuna Taqim Aralliri
uk: Волліс і Футуна
vi: Wallis và Futuna
wo: Wallis ak Futuna
zh: 瓦利斯和富图纳群岛

View File

@@ -1,2 +0,0 @@
name:
default: Mayotte

View File

@@ -61,13 +61,6 @@ am:
pattern: "dddd"
# Netherlands Antilles (De Nederlandse Antillen)
an:
partition: 58
languages: nl, en, pap
names: !include country-names/an.yaml
# Angola (Angola)
ao:
partition: 85
@@ -76,14 +69,6 @@ ao:
postcode: no
# (Antarctica)
aq:
partition: 181
languages: en, es, fr, ru
names: !include country-names/aq.yaml
postcode: no
# Argentina (Argentina)
ar:
partition: 39
@@ -93,13 +78,6 @@ ar:
pattern: "l?dddd(?:lll)?"
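In these postcode settings, `d` is shorthand for a digit and `l` for a letter, while the rest of the value is ordinary regex syntax. A hypothetical expansion helper (the real translation happens inside Nominatim, not in this file) applied to the Argentinian pattern above:

```python
import re


def expand(pattern):
    # Assumed convention: 'd' -> one digit, 'l' -> one uppercase letter;
    # other characters ('?', '(?:...)', etc.) pass through as regex.
    return pattern.replace('d', r'\d').replace('l', '[A-Z]')


# Argentina: optional letter, four digits, optional three-letter suffix.
rx = re.compile(expand('l?dddd(?:lll)?'))

print(bool(rx.fullmatch('C1424ABC')))  # new-style CPA code
print(bool(rx.fullmatch('1424')))      # old-style four digits
print(bool(rx.fullmatch('ABCD')))      # not a postcode
```

The replacement order is safe here because neither substitution introduces a character the other rewrites, but a production implementation would tokenize rather than chain `str.replace`.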
# (American Samoa)
as:
partition: 182
languages: en, sm
names: !include country-names/as.yaml
# Austria (Österreich)
at:
partition: 245
@@ -118,21 +96,6 @@ au:
pattern: "dddd"
# (Aruba)
aw:
partition: 183
languages: nl, pap
names: !include country-names/aw.yaml
postcode: no
# (Aland Islands)
ax:
partition: 184
languages: sv
names: !include country-names/ax.yaml
# Azerbaijan (Azərbaycan)
az:
partition: 119
@@ -221,13 +184,6 @@ bj:
postcode: no
# (Saint Barthélemy)
bl:
partition: 204
languages: fr
names: !include country-names/bl.yaml
# Bermuda (Bermuda)
bm:
partition: 176
@@ -256,13 +212,6 @@ bo:
postcode: no
# Caribbean Netherlands (Caribisch Nederland)
bq:
partition: 250
languages: nl
names: !include country-names/bq.yaml
# Brazil (Brasil)
br:
partition: 121
@@ -290,13 +239,6 @@ bt:
pattern: "ddddd"
# (Bouvet Island)
bv:
partition: 185
languages: "no"
names: !include country-names/bv.yaml
# Botswana (Botswana)
bw:
partition: 122
@@ -332,13 +274,6 @@ ca:
output: \1 \2
# Cocos (Keeling) Islands (Cocos (Keeling) Islands)
cc:
partition: 118
languages: en
names: !include country-names/cc.yaml
# Democratic Republic of the Congo (République démocratique du Congo)
cd:
partition: 229
@@ -450,20 +385,6 @@ cv:
pattern: "dddd"
# Curaçao (Curaçao)
cw:
partition: 248
languages: nl, en
names: !include country-names/cw.yaml
# Christmas Island (Christmas Island)
cx:
partition: 177
languages: en
names: !include country-names/cx.yaml
# Cyprus (Κύπρος - Kıbrıs)
cy:
partition: 114
@@ -683,13 +604,6 @@ ge:
pattern: "dddd"
# French Guiana (Guyane Française)
gf:
partition: 231
languages: fr
names: !include country-names/gf.yaml
# Guernsey (Guernsey)
gg:
partition: 77
@@ -745,13 +659,6 @@ gn:
pattern: "ddd"
# Guadeloupe (Guadeloupe)
gp:
partition: 232
languages: fr
names: !include country-names/gp.yaml
# Equatorial Guinea (Guinea Ecuatorial)
gq:
partition: 12
@@ -789,13 +696,6 @@ gt:
pattern: "ddddd"
# Guam (Guam)
gu:
partition: 187
languages: en, ch
names: !include country-names/gu.yaml
# Guinea-Bissau (Guiné-Bissau)
gw:
partition: 8
@@ -813,20 +713,6 @@ gy:
postcode: no
# (Hong Kong)
hk:
partition: 188
languages: zh-hant, en
names: !include country-names/hk.yaml
# (Heard Island and McDonald Islands)
hm:
partition: 189
languages: en
names: !include country-names/hm.yaml
# Honduras (Honduras)
hn:
partition: 56
@@ -1229,13 +1115,6 @@ me:
pattern: "ddddd"
# Saint Martin (Saint Martin)
mf:
partition: 203
languages: fr
names: !include country-names/mf.yaml
# Madagascar (Madagasikara)
mg:
partition: 164
@@ -1289,28 +1168,6 @@ mn:
pattern: "ddddd"
# Macao (Macao)
mo:
partition: 191
languages: zh-hant, pt
names: !include country-names/mo.yaml
postcode: no
# Northern Mariana Islands (Northern Mariana Islands)
mp:
partition: 192
languages: ch, en
names: !include country-names/mp.yaml
# Martinique (Martinique)
mq:
partition: 233
languages: fr
names: !include country-names/mq.yaml
# Mauritania (موريتانيا)
mr:
partition: 149
@@ -1398,13 +1255,6 @@ na:
pattern: "ddddd"
# New Caledonia (Nouvelle-Calédonie)
nc:
partition: 234
languages: fr
names: !include country-names/nc.yaml
# Niger (Niger)
ne:
partition: 226
@@ -1414,13 +1264,6 @@ ne:
pattern: "dddd"
# Norfolk Island (Norfolk Island)
nf:
partition: 100
languages: en, pih
names: !include country-names/nf.yaml
# Nigeria (Nigeria)
ng:
partition: 218
@@ -1519,13 +1362,6 @@ pe:
pattern: "ddddd"
# French Polynesia (Polynésie française)
pf:
partition: 202
languages: fr
names: !include country-names/pf.yaml
# Papua New Guinea (Papua Niugini)
pg:
partition: 71
@@ -1563,13 +1399,6 @@ pl:
output: \1-\2
# Saint Pierre and Miquelon (Saint-Pierre-et-Miquelon)
pm:
partition: 236
languages: fr
names: !include country-names/pm.yaml
# Pitcairn Islands (Pitcairn Islands)
pn:
partition: 113
@@ -1580,13 +1409,6 @@ pn:
output: \1 \2
# Puerto Rico (Puerto Rico)
pr:
partition: 193
languages: es, en
names: !include country-names/pr.yaml
# Palestinian Territory (Palestinian Territory)
ps:
partition: 194
@@ -1631,13 +1453,6 @@ qa:
postcode: no
# (Réunion)
re:
partition: 235
languages: fr
names: !include country-names/re.yaml
# Romania (România)
ro:
partition: 170
@@ -1745,13 +1560,6 @@ si:
pattern: "dddd"
# (Svalbard and Jan Mayen)
sj:
partition: 197
languages: "no"
names: !include country-names/sj.yaml
# Slovakia (Slovensko)
sk:
partition: 172
@@ -1831,13 +1639,6 @@ sv:
pattern: "dddd"
# (Sint Maarten)
sx:
partition: 249
languages: nl, en
names: !include country-names/sx.yaml
# Syria (سوريا)
sy:
partition: 104
@@ -1873,13 +1674,6 @@ td:
postcode: no
# French Southern Lands (Terres australes et antarctiques françaises)
tf:
partition: 132
languages: fr
names: !include country-names/tf.yaml
# Togo (Togo)
tg:
partition: 243
@@ -2009,15 +1803,6 @@ ug:
postcode: no
# (United States Minor Outlying Islands)
um:
partition: 198
languages: en
names: !include country-names/um.yaml
postcode:
pattern: "96898"
# United States (United States)
us:
partition: 2
@@ -2083,13 +1868,6 @@ vg:
output: VG\1
# (United States Virgin Islands)
vi:
partition: 199
languages: en
names: !include country-names/vi.yaml
# Vietnam (Việt Nam)
vn:
partition: 75
@@ -2107,13 +1885,6 @@ vu:
postcode: no
# Wallis and Futuna Islands (Wallis-et-Futuna)
wf:
partition: 238
languages: fr
names: !include country-names/wf.yaml
# Samoa (Sāmoa)
ws:
partition: 131
@@ -2138,13 +1909,6 @@ ye:
postcode: no
# Mayotte (Mayotte)
yt:
partition: 200
languages: fr
names: !include country-names/yt.yaml
# South Africa (South Africa)
za:
partition: 76

Some files were not shown because too many files have changed in this diff.