Compare commits

...

429 Commits

Author SHA1 Message Date
Sarah Hoffmann
9a979b7429 Merge pull request #3951 from Itz-Agasta/cli
Feat: Adds layer filtering option to search cli command
2026-01-29 09:58:06 +01:00
Itz-Agasta
6ad87db1eb Updates layer selection to allow optional default
- Modifies layer argument handling to permit having no default layers where appropriate.
- Updates the help text for the layer parameter in the reverse command
2026-01-29 11:33:21 +05:30
Sarah Hoffmann
f4820bed0e Merge pull request #3950 from jayaddison/fixup/sql-debug-output-escaping
Fixup: add single-quote escaping within debug message
2026-01-28 20:30:11 +01:00
Itz-Agasta
bf6eb01d68 Adds layer filtering option to search command
Introduces a CLI argument to restrict search results
to specified data layers, enabling more targeted queries.
2026-01-28 12:16:43 +05:30
James Addison
f07676a376 Fixup: add single-quote escaping within debug message 2026-01-28 01:27:53 +00:00
Sarah Hoffmann
67ecf5f6a0 Merge pull request #3943 from Itz-Agasta/test_fix
Tests: Replace eval() with ast.literal_eval() for safer parsing
2026-01-25 10:10:15 +01:00
Itz-Agasta
e77a4c2f35 Switch to ast.literal_eval for dict parsing
Some test data in the BDD feature files includes Python raw strings and escape sequences that standard json.loads() cannot parse, so switch to the safer Python literal evaluation
for converting string representations of dictionaries.
2026-01-24 15:32:47 +05:30
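A minimal illustration of the switch described above (the test data shown is made up): ast.literal_eval parses Python-style dict literals with single quotes and escape sequences that json.loads() rejects, while still refusing to execute arbitrary code.

```python
import ast

# Made-up example in the spirit of the BDD test data described above:
# single quotes and Python-style strings are not valid JSON.
text = "{'osm_type': 'N', 'names': {'name': \"O'Brien's\"}}"

data = ast.literal_eval(text)     # safely evaluates literals only
assert data['names']['name'] == "O'Brien's"

# json.loads(text) would raise json.JSONDecodeError here, and eval(text)
# would execute arbitrary code if the input were malicious.
```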
Itz-Agasta
9fa980bca2 Replaces eval with json.loads for safer dict parsing
Switches from eval to json.loads when parsing string representations
of dictionaries to prevent arbitrary code execution.
2026-01-24 15:32:47 +05:30
Sarah Hoffmann
fe773c12b2 Merge pull request #3946 from lonvia/enable-entrances-for-reverse
Enable entrance lookup for reverse and lookup
2026-01-23 22:10:43 +01:00
Sarah Hoffmann
cc96912580 Merge pull request #3906 from AyushDharDubey/fix/issue_2463-Use-search_name-table-for-TIGER-data-imports-on-'dropped'-databases
Use `search_name` as fallback for TIGER imports when update tables are dropped
2026-01-23 20:52:40 +01:00
Sarah Hoffmann
77a3ecd72d Merge pull request #3945 from lonvia/fix-starlette-tests
Update Starlette tests to using their TestClient
2026-01-23 20:45:15 +01:00
Sarah Hoffmann
6a6a064ef7 enable entrances for reverse and lookup 2026-01-23 17:38:47 +01:00
Sarah Hoffmann
35b42ad9ce update Starlette tests to using their TestClient 2026-01-23 16:28:13 +01:00
Sri Charan Chittineni
c4dc2c862e fix mypy typing for Starlette state object (#3944) 2026-01-22 13:21:34 +01:00
Sarah Hoffmann
7e44256f4a Merge pull request #3939 from lonvia/more-table-constraints
Add NOT NULL and UNIQUE constraints on tables
2026-01-14 15:04:45 +01:00
Ayush Dhar Dubey
eefd0efa59 update test frozen db: new tiger import mechanism 2026-01-09 17:47:07 +05:30
Ayush Dhar Dubey
2698382552 permit import of tiger after freeze 2026-01-09 17:35:01 +05:30
Ayush Dhar Dubey
954771a42d Add fallback search mechanism for dropped databases lookup 2026-01-09 17:35:01 +05:30
Sarah Hoffmann
e47601754a do not attempt to delete old data for newly created placex entries 2026-01-07 17:08:28 +01:00
Sarah Hoffmann
2cdf2db184 add NOT NULL and UNIQUE constraints where possible 2026-01-07 15:46:05 +01:00
Sarah Hoffmann
5200e11f33 ignore countries without geometry or country code for location_area 2026-01-07 11:43:32 +01:00
Sarah Hoffmann
ba1fc5a5b8 do not insert entries with empty name into search name 2026-01-07 11:27:55 +01:00
Sarah Hoffmann
d35a71c123 ensure correct indexed_status transitions 2026-01-07 11:12:35 +01:00
Sarah Hoffmann
e31862b7b5 make sure that importance is always set to a non-null value
Secondary importance might return invalid values in some cases.
2026-01-07 10:29:45 +01:00
Sarah Hoffmann
9ac5e0256d make sure array_merge() never returns null 2026-01-07 10:22:03 +01:00
Sarah Hoffmann
a4a2176ded immediately terminate indexing when a task catches an exception 2026-01-07 09:58:40 +01:00
Sarah Hoffmann
f30fcdcd9d BDD: make sure randomly generated names always contain a letter 2026-01-07 09:58:40 +01:00
otbutz
77b8e76be6 Add PR template (#3934) 2026-01-05 17:42:35 +01:00
Sarah Hoffmann
20a333dd9b Merge pull request #3930 from lonvia/remove-new-query-log-table
Remove unused new_query_log table
2026-01-02 09:58:05 +01:00
Sarah Hoffmann
084e1b8177 remove unused new_query_log table 2026-01-01 20:30:37 +01:00
Sarah Hoffmann
2e2ce2c979 fix version counts 2026-01-01 14:42:12 +01:00
Sarah Hoffmann
99643aa0e9 ignore postcode areas in countries without postcodes properly 2026-01-01 11:21:40 +01:00
Sarah Hoffmann
c05b8f241c make sure we use exactly the same table structure as osm2pgsql 2025-12-31 00:21:27 +01:00
Sarah Hoffmann
da94d7eea3 need an analyse after the migration 2025-12-30 19:49:07 +01:00
Sarah Hoffmann
f9864b7ec7 grant access right to www user for new postcode table 2025-12-30 17:48:33 +01:00
Sarah Hoffmann
df4abfd5cc Merge pull request #3926 from lonvia/rework-postcode-handling
Reorganise postcode handling
2025-12-30 15:54:33 +01:00
Sarah Hoffmann
42d139a5d0 analyze postcode table during import 2025-12-30 15:21:20 +01:00
Sarah Hoffmann
f2110e12d6 simplify postcode area for lookups 2025-12-30 15:21:20 +01:00
Sarah Hoffmann
3bcd1aa721 adapt BDD tests for new postcode table structure 2025-12-30 15:21:20 +01:00
Sarah Hoffmann
354aa07cad adapt unit tests to new postcode algorithms 2025-12-30 15:21:18 +01:00
Sarah Hoffmann
deb6654cfd add migration for new postcode table 2025-12-30 15:20:46 +01:00
Sarah Hoffmann
6a67cfcddf adapt search frontend to new postcode table 2025-12-30 15:20:46 +01:00
Sarah Hoffmann
f9cf320794 set custom postcode extents for some countries 2025-12-30 15:20:46 +01:00
Sarah Hoffmann
d1cb578535 rework postcode computation
Now adds areas to location_postcodes, ignores postcode points
inside areas and supports customizable extents.
2025-12-30 15:20:46 +01:00
Sarah Hoffmann
a97b5d97cb add support for custom per-country postcode extents 2025-12-30 15:20:46 +01:00
Sarah Hoffmann
9ec607b556 change confusing value in debug output for missing importance 2025-12-30 15:20:46 +01:00
Sarah Hoffmann
89821d01e0 reorganise layout of location_postcode table
Also renames the table as this will make it easier to migrate.
2025-12-30 15:20:46 +01:00
Sarah Hoffmann
7ef3f99fa4 drop new place sub-tables on freezing 2025-12-30 15:20:46 +01:00
Sarah Hoffmann
0aa9eee3e7 remove special casing for postcodes in trigger code 2025-12-30 15:20:46 +01:00
Sarah Hoffmann
340fe64e8b put postcodes in extra table on import 2025-12-30 15:20:46 +01:00
Sarah Hoffmann
0b11dd0eba Merge pull request #3925 from Aditya30ag/fix-typo-place-addressline-test
Fix typo in place_addressline table name in tests
2025-12-30 14:36:59 +01:00
Aditya30ag
3b182afa72 Fix typo in place_addressline table name in tests 2025-12-30 17:40:08 +05:30
Sarah Hoffmann
ae77a9512a Merge pull request #3919 from 28Deepakpandey/Fix-docs-locale-typo
Fix: Locale → Locales references in docs
2025-12-28 15:55:49 +01:00
28Deepakpandey
f7ba1fc9e1 Fix: corrected Locale → Locales references and ensured proper casing in docs 2025-12-26 02:46:26 +05:30
Sarah Hoffmann
26e62fda19 Merge pull request #3913 from AyushDharDubey/fix/issue_3909
Reuse Configuration instance in Locales
2025-12-22 16:38:27 +01:00
Ayush Dhar Dubey
4fd616254a update Locales constructor:
expect output names as argument and avoid redundant configuration initialization
2025-12-20 19:15:33 +05:30
Ayush Dhar Dubey
049164086a fix: ensure Locales is not initialized when provided in options 2025-12-20 19:12:15 +05:30
Sarah Hoffmann
5e965d5216 Merge pull request #3910 from lonvia/update-ps-names
Update default names for Palestinian Territories
2025-12-15 21:16:56 +01:00
Sarah Hoffmann
2097401b12 update default names for Palestinian Territories 2025-12-15 19:25:15 +01:00
Sarah Hoffmann
58e56ec53d Merge pull request #3901 from AyushDharDubey/fix/issue_3829-use-mwparserfromhell-to-parse-sp-wiki-page
Replace regex with `mwparserfromhell` based MW WikiCode Parsing for Special Phrases
2025-12-08 11:51:50 +01:00
Ayush Dhar Dubey
fe170c9286 add mwparserfromhell to apt prerequisites for CI build 2025-12-08 15:51:35 +05:30
Ayush Dhar Dubey
0c5af2e3e4 update new dependency instructions: mwparserfromhell 2025-12-08 15:01:35 +05:30
Sarah Hoffmann
681daeea29 Merge pull request #3902 from lonvia/fix-only-closest-housenumber
Reverse: only return housenumbers near street
2025-12-08 09:48:29 +01:00
Ayush Dhar Dubey
49454048c4 use mwparserfromhell to parse SP wiki page reliably 2025-12-08 11:01:14 +05:30
Ayush Dhar Dubey
4919240377 test for cell-per-line format 2025-12-08 11:01:14 +05:30
Ayush Dhar Dubey
56cb183c4e update sp test content
add latest <generator>MediaWiki 1.43.5</generator>
add test case for one-row-per-line
2025-12-08 10:59:10 +05:30
Sarah Hoffmann
35060164ab reverse: only return housenumbers near street 2025-12-07 11:00:23 +01:00
Sarah Hoffmann
4cfc1792fb Merge pull request #3899 from lonvia/improve-reverse-performance
Streamline reverse lookup slightly
2025-12-07 09:39:10 +01:00
Sarah Hoffmann
3bb5d00848 avoid extra query for finding closest housenumber in reverse 2025-12-05 17:09:13 +01:00
Sarah Hoffmann
b366b9df6f reverse: avoid interpolation lookup when result is already perfect 2025-12-05 17:08:46 +01:00
Sarah Hoffmann
6b12501c7a Merge pull request #3898 from lonvia/fix-country-restriction
Fix comparison between country tokens and country restriction
2025-12-04 20:03:14 +01:00
Sarah Hoffmann
ffd5c32f17 fix comparison between country tokens and country restriction 2025-12-04 18:29:25 +01:00
Sarah Hoffmann
6c8869439f Merge pull request #3897 from lonvia/test-psycopg-33
Allow psycopg 3.3 back
2025-12-04 17:10:55 +01:00
Sarah Hoffmann
8188946394 ignore typing issue 2025-12-03 10:22:11 +01:00
Sarah Hoffmann
19134cc15c exclude psycopg 3.3.0 which breaks named cursors 2025-12-03 10:22:04 +01:00
Sarah Hoffmann
d0b9aac400 Merge pull request #3895 from lonvia/flaky-test
Fix flaky test around postcode word match penalties
2025-12-02 12:46:43 +01:00
Sarah Hoffmann
48d13c593b fix flaky test around postcode word match penalties 2025-12-02 11:15:37 +01:00
Sarah Hoffmann
96d04e3a2e Merge pull request #3894 from lonvia/country-names-with-word-lookup
Add normalized form of country names to country tokens in word table
2025-12-01 14:54:24 +01:00
Sarah Hoffmann
23db1ab981 avoid most recent psycopg 3.3 release 2025-12-01 14:23:36 +01:00
Sarah Hoffmann
cd1b1736a9 add migration for changed country token format 2025-12-01 13:10:18 +01:00
Sarah Hoffmann
9447c90b09 adapt tests to new country token format 2025-12-01 13:10:18 +01:00
Sarah Hoffmann
81c6cb72e6 add normalised country name to word table
Country tokens now follow the usual convention of having the
normalized version in the word column and the extra info about the
country code in the info column.
2025-12-01 13:10:18 +01:00
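A rough illustration of the layout described above; the column names and values below are assumptions for illustration and may not match the actual word table exactly.

```python
# Hypothetical country token row, purely illustrative of the description above.
country_token = {
    'word_token': 'germany',   # transliterated lookup term
    'type': 'C',               # assumed marker for a country token
    'word': 'germany',         # normalized country name (new with this change)
    'info': {'cc': 'de'},      # country code kept as extra info
}
```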
Sarah Hoffmann
f2a122c5c0 Merge pull request #3893 from lonvia/nature-reserve
Prefer leisure=nature_reserve as main tag over boundary=protected_area
2025-12-01 11:36:17 +01:00
Sarah Hoffmann
57ef0e1f98 prefer leisure=nature_reserve as main tag 2025-12-01 09:47:55 +01:00
Sarah Hoffmann
922667b650 Merge pull request #3892 from daishu0000/master
Add success message to setup.log: related to #3891
2025-11-30 14:13:51 +01:00
Sarah Hoffmann
fba803167c fix imprecise import 2025-11-30 11:50:55 +01:00
daishu0000
782df52ea0 Add success message to db log 2025-11-30 01:53:40 +08:00
Sarah Hoffmann
c36da68a48 Merge pull request #3890 from mtmail/remove-nat-name
Skip nat_name in default import
2025-11-28 14:13:30 +01:00
marc tobias
716de13bc9 Skip nat_name in default import 2025-11-28 11:35:35 +01:00
Sarah Hoffmann
1df56d7548 Merge pull request #3889 from lonvia/improve-linkage-code
Small improvements to place linking code
2025-11-26 22:11:11 +01:00
Sarah Hoffmann
9cfef7a31a prefer wikidata over name match when linking 2025-11-26 17:44:47 +01:00
Sarah Hoffmann
139678f367 fix linkage removal when nothing has changed 2025-11-26 17:03:19 +01:00
Sarah Hoffmann
e578c60ff4 Merge pull request #3874 from vytas7/falcon-4.2-typing
Adapt type annotations to Falcon App type changes
2025-11-16 16:12:35 +01:00
Vytautas Liuolia
7b4a3c8500 Add from __future__ import annotations to delay evaluation 2025-11-16 14:41:25 +01:00
Vytautas Liuolia
7751f9a6b6 Adapt type annotations to Falcon App type changes
See also: https://falcon.readthedocs.io/en/latest/api/typing.html#generic-app-types
2025-11-10 20:09:17 +01:00
Sarah Hoffmann
303ac42b47 Merge pull request #3862 from mtmail/skip-all-zero-postcodes
Postcode sanitizer now skips values which are only zeros
2025-10-31 10:36:05 +01:00
Sarah Hoffmann
6a2d2daad5 Merge pull request #3863 from lonvia/improve-bdd-test-names
Add custom pytest collector for BDD feature files
2025-10-31 10:19:56 +01:00
Sarah Hoffmann
a51c771107 disable improved BDD test naming for pytest < 8
Needs the improved test collector introduced in pytest 8.0.
2025-10-30 20:50:00 +01:00
Sarah Hoffmann
55547723bf add custom pytest collector for BDD feature files 2025-10-30 17:56:23 +01:00
marc tobias
362088775f postcode sanitizer skips postcodes which are only zeros 2025-10-30 13:45:29 +01:00
Sarah Hoffmann
9a13b62fb9 prepare release 5.2.0 2025-10-29 10:01:30 +01:00
Sarah Hoffmann
3ecda751c4 Merge pull request #3861 from lonvia/force-extra-tags
Force inclusion of extra tags when Nominatim internally depends on them
2025-10-29 08:51:40 +01:00
Sarah Hoffmann
5d4c29b84b force inclusion of extratags used directly by Nominatim 2025-10-28 17:20:17 +01:00
Sarah Hoffmann
f1fbc04f33 harmonize use of callback with set_entrance_filter
All other functions expect a simple function, so do this here as well.
2025-10-28 14:33:45 +01:00
Sarah Hoffmann
353c985b9f Merge pull request #3859 from lonvia/fix-entrance-addresses
Move entrances to a separate table
2025-10-24 13:38:21 +02:00
Sarah Hoffmann
2dda9079f0 add BDD tests for importing into the new place_entrance table 2025-10-24 10:52:25 +02:00
Sarah Hoffmann
4c91a0bc8d fix syntax in presets 2025-10-24 09:43:06 +02:00
Sarah Hoffmann
31c8ec6db0 add documentation for entrance table configuration 2025-10-23 20:53:59 +02:00
Sarah Hoffmann
e2330ff4c1 add migration for separate entrance table 2025-10-23 17:25:20 +02:00
Sarah Hoffmann
589825d37e adapt tests for extra place_entrance table 2025-10-23 17:25:20 +02:00
Sarah Hoffmann
a93113bc44 use extra place_entrance table 2025-10-23 17:25:20 +02:00
Sarah Hoffmann
b042eca382 move entrances into extra table 2025-10-23 17:25:20 +02:00
Sarah Hoffmann
d202a8f7d8 Merge pull request #3857 from lonvia/leisure-garden
Be more conservative when including leisure=garden/commons
2025-10-22 14:03:19 +02:00
Sarah Hoffmann
af6386bd68 move some leisure features into manmade layer 2025-10-22 11:27:25 +02:00
Sarah Hoffmann
862bfdf6fb correct default values for layer on reverse 2025-10-22 11:27:25 +02:00
Sarah Hoffmann
28029edc8b exclude unnamed gardens and commons
These are mostly private gardens and small stretches of green
not of interest for a general search.
2025-10-22 11:26:58 +02:00
Sarah Hoffmann
d6e9196177 Merge pull request #3855 from hasandiwan/master
force layer to be address
2025-10-22 11:24:40 +02:00
Hasan Diwan
e0a750e089 force layer to be address 2025-10-22 07:55:10 +00:00
Sarah Hoffmann
93b2a0f194 update CI to test against PostgreSQL 18 2025-10-20 18:50:30 +02:00
Sarah Hoffmann
aa3fce6852 correct default setting for addressdetails parameter in lookup
Fixes #3850.
2025-10-11 09:20:10 +02:00
Sarah Hoffmann
535ffc1e3f Merge pull request #3840 from lonvia/normalize-penalties
Improve termination condition for forward search
2025-09-12 21:59:39 +02:00
Sarah Hoffmann
77ed4635f2 Merge pull request #3836 from Johannes-Andersen/chore/i18NorwegianCountries
chore: update no,nb,nn country-names translation
2025-09-12 21:01:03 +02:00
Sarah Hoffmann
7715a9d500 fix new mypy issue 2025-09-12 19:32:49 +02:00
Johannes Andersen
58d570ca8a chore: update no,nb,nn country-names translation 2025-09-12 18:20:56 +02:00
Sarah Hoffmann
5a8aa6cce4 adapt tests to new penalties 2025-09-12 17:45:22 +02:00
Sarah Hoffmann
72592da0cc reduce penalty for artificial housenumbers 2025-09-12 17:44:54 +02:00
Sarah Hoffmann
193d6c4173 in-word penalty for final address token 2025-09-12 12:05:29 +02:00
Sarah Hoffmann
4fd881bcb2 housenumber and postcode cross penalties for partials 2025-09-12 11:50:01 +02:00
Sarah Hoffmann
54620f9566 base penalty for housenumber searches on similar address searches 2025-09-12 10:52:42 +02:00
Sarah Hoffmann
42b687f545 stop searching earlier after the first results was found 2025-09-12 10:01:13 +02:00
Sarah Hoffmann
43ffceff27 remove base penalty for postcodes
This is a relic from having base penalties for all terms.
2025-09-12 09:45:57 +02:00
Sarah Hoffmann
2fb03cd103 Merge pull request #3835 from lonvia/remove-japanese-variants
Remove japanese variants
2025-09-11 17:45:30 +02:00
Sarah Hoffmann
8d3d24a1e4 Merge pull request #3834 from lonvia/neighbourhoods
Improve handling of neighbourhoods in addresses
2025-09-11 15:25:01 +02:00
Sarah Hoffmann
8efdab1d6f remove japanese variants
Variants are only meant for word morphing which does not exist
for Kanji.
2025-09-11 15:20:57 +02:00
Sarah Hoffmann
1d1d80e1e3 adapt BDD tests for new address ranks 2025-09-11 11:56:39 +02:00
Sarah Hoffmann
670cf98f93 fix query time logging for structured queries 2025-09-11 10:54:02 +02:00
Sarah Hoffmann
433c40cd68 downgrade neighbourhoods and landuses
Neighbourhoods should be below a quarter hierarchically speaking, so
downgrade them a bit. Consider named landuses the area form of a
neighbourhood and put them at the same level.
2025-09-11 10:20:33 +02:00
Sarah Hoffmann
a049569020 downgrade Japanese boundaries one level
Definition is shifted by one compared to other countries,
see https://wiki.openstreetmap.org/wiki/Tag:boundary%3Dadministrative
2025-09-11 10:01:24 +02:00
Sarah Hoffmann
bf49f6a46f Merge pull request #3833 from lonvia/rework-logging
Introduce generic query statistics and make log output configurable
2025-09-11 08:46:44 +02:00
Sarah Hoffmann
45a44f1411 export QueryStatistics type 2025-09-10 21:40:39 +02:00
Sarah Hoffmann
5a2bfd7a19 add documentation for library API 2025-09-10 21:38:09 +02:00
Sarah Hoffmann
fd12d2e9f3 add additional stats for search queries 2025-09-10 20:49:46 +02:00
Sarah Hoffmann
3d0867ff16 make log output configurable 2025-09-10 20:11:46 +02:00
Sarah Hoffmann
177b16b89b use new QueryStatistics in API server 2025-09-10 11:52:06 +02:00
Sarah Hoffmann
0b7bde2500 introduce parameter for saving query statistics 2025-09-10 10:24:20 +02:00
Sarah Hoffmann
7ac3591433 Merge pull request #3830 from lonvia/split-transliteration
Improve word match penalty for scripts without word boundaries
2025-09-09 10:28:21 +02:00
Sarah Hoffmann
07c2907064 split normalized word when transliteration is split up 2025-09-08 22:58:01 +02:00
Sarah Hoffmann
355cbcc7b8 Merge pull request #3828 from lonvia/code-cleanup
Code cleanup
2025-09-06 16:59:52 +02:00
Sarah Hoffmann
8339c2b928 no longer accept None in result maker functions 2025-09-06 11:09:40 +02:00
Sarah Hoffmann
341c09ee95 remove unused functions 2025-09-06 11:09:40 +02:00
Sarah Hoffmann
b0b909be93 Merge pull request #3827 from lonvia/rework-query-timeouts
Apply request timeouts while waiting for a connection
2025-09-06 11:08:55 +02:00
Sarah Hoffmann
bf604e36ee add test for timeout class 2025-09-05 23:31:09 +02:00
Sarah Hoffmann
3a50f749dd apply request timeout also while waiting for a connection from pool 2025-09-05 23:31:09 +02:00
Sarah Hoffmann
563255202d read request_timeout configuration only once 2025-09-05 09:18:50 +02:00
Sarah Hoffmann
94d22bbdac Merge pull request #3825 from emlove/entrance-docs
Some docs for the entrances output
2025-09-03 21:18:15 +02:00
Emily Love Watson
32d26f12c4 Add example entrances output 2025-09-03 09:55:43 -05:00
Sarah Hoffmann
0f324c8cb2 Merge pull request #3826 from lonvia/decrease-default-pool-size
reduce default DB pool size
2025-09-03 08:53:54 +02:00
Emily Love Watson
1e3b56d215 Some docs for the entrances output 2025-09-02 21:56:56 -05:00
Sarah Hoffmann
e855552e01 reduce default DB pool size 2025-09-02 22:10:30 +02:00
Sarah Hoffmann
79a1907c49 Merge pull request #3807 from emlove/return-entrance-location
Index and return entrance coordinates for places
2025-08-30 20:08:33 +02:00
Emily Love Watson
91e345f77f Store entrance fields as columns on table 2025-08-29 10:26:29 -05:00
Emily Love Watson
d0ad65f696 Select all entrances for results in one query 2025-08-29 10:26:29 -05:00
Emily Love Watson
e916d27b7c Update entrances when entrance nodes are updated 2025-08-29 10:26:29 -05:00
Emily Love Watson
823ad5d279 Update entrances schema 2025-08-29 10:26:29 -05:00
Emily Love Watson
048d571e46 Index and return entrance coordinates for indexed locations 2025-08-29 10:25:44 -05:00
Sarah Hoffmann
f5e4b74c38 Merge pull request #3823 from lonvia/fix-postcode-difference
Fix difference computation on postcode updates
2025-08-29 17:06:08 +02:00
Sarah Hoffmann
c2a311e69c fix postcode update computation: use distance 2025-08-29 15:10:27 +02:00
Sarah Hoffmann
5968f7d646 Merge pull request #3816 from anqixxx/locale-doc-update
Update to library locale documentation in light of refactor
2025-08-28 22:06:41 +02:00
anqixxx
4cdd2526b6 Updated and restructured library documentation to include Locale changes
Updated Getting Started Docs

Added documentation for Result Handling

Removed API documentation
2025-08-27 09:18:16 -07:00
Sarah Hoffmann
4ff7696ed3 Merge pull request #3820 from mtmail/berlin-ost-hauptbahnhof
Sanitizer no longer strips name parts in brackets when more parts follow
2025-08-23 17:17:39 +02:00
marc tobias
247afe1f56 sanitizer no longer strips name parts in brackets when more parts follow 2025-08-23 01:06:35 +02:00
Sarah Hoffmann
6f74141fa4 Merge pull request #3819 from lonvia/ignore-survey
Ignore survey:* tags
2025-08-22 22:10:02 +02:00
Sarah Hoffmann
75ccf97de3 ignore survey:* tags 2025-08-22 10:59:58 +02:00
Sarah Hoffmann
196de9e974 Merge pull request #3796 from anqixxx/locale-refactor
Localize() + Results refactor
2025-08-13 14:08:42 +02:00
anqixxx
6b627df4fb Locales and localization refactor with Locales as a localizer object.
Removed auto-localization from search/search_address APIs (now explicit), simplified AddressLines to subclass List[AddressLine], made display_name a computed property in Results instead of a field and removed result-localization circular dependencies
2025-08-12 08:05:37 -04:00
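A small sketch of the pattern described above; the class and attribute names are invented for illustration and are not taken from the Nominatim sources. The point is that display_name is derived on access from the address lines rather than stored as a field.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AddressLine:
    name: str

class AddressLines(List[AddressLine]):
    """Behaves like a plain list of address lines with a small helper."""

    def localized_names(self) -> List[str]:
        return [line.name for line in self]

@dataclass
class Result:
    address_rows: AddressLines = field(default_factory=AddressLines)

    @property
    def display_name(self) -> str:
        # computed on access instead of being stored as a field
        return ', '.join(self.address_rows.localized_names())
```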
Sarah Hoffmann
b7d77b9b43 avoid symbolic link to files in packaging
Hatch cannot handle those correctly and will add a symbolic link to the
source package.
2025-08-06 21:59:13 +02:00
Sarah Hoffmann
7e84d38a92 Merge pull request #3811 from lonvia/fix-frequent-terms-with-viewbox
Don't restrict to viewbox for frequent terms
2025-08-06 21:10:07 +02:00
Sarah Hoffmann
c7df8738ed fix typing issue with latest falcon version 2025-08-06 20:08:10 +02:00
Sarah Hoffmann
0045203092 don't restrict to viewbox for frequent terms
All searched places may be outside the viewbox in which case the
restriction means that there are no results at all. Add the penalty for
being outside the viewbox earlier instead and then cut the list.
2025-08-06 17:27:52 +02:00
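A minimal sketch of the strategy described above; the candidate and viewbox objects are assumptions, not the actual Nominatim data structures. Results outside the viewbox are penalized rather than filtered, and the ranked list is cut afterwards.

```python
from typing import List

def rank_with_viewbox(candidates: List, viewbox, outside_penalty: float = 0.5,
                      limit: int = 20) -> List:
    # Instead of discarding candidates outside the viewbox up front,
    # add a penalty so they only lose out when better results exist.
    for cand in candidates:
        if not viewbox.contains(cand.centroid):
            cand.penalty += outside_penalty
    return sorted(candidates, key=lambda c: c.penalty)[:limit]
```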
Sarah Hoffmann
b325413486 Merge pull request #3808 from lonvia/avoid-st-relate
Replace ST_Relate by shortcut functions
2025-08-06 16:28:51 +02:00
Sarah Hoffmann
6270c90052 replace ST_Relate by shortcut functions
For some reason ST_Relate returns wrong results in the context of
the trigger on Debian Trixie. Works fine with the Postgis version
from postgresql.org.
2025-08-06 14:43:07 +02:00
Sarah Hoffmann
a7709c768d add test for reverse with address layer and inherited address 2025-07-31 22:25:55 +02:00
Sarah Hoffmann
47c0a101b9 Merge pull request #3799 from lonvia/reduce-coordinate-precision
Reduce coordinate precision of centroids and interpolation lines
2025-07-30 14:50:36 +02:00
Sarah Hoffmann
64bb8c2a9c Merge pull request #3800 from lonvia/improve-style-docs
Improvements to documentation for custom import styles
2025-07-30 14:50:17 +02:00
Sarah Hoffmann
194b607491 Merge pull request #3797 from mtmail/database-version-not-found
Better hint to user if database import didn't finish
2025-07-30 12:08:10 +02:00
marc tobias
9bad3b1e61 Better hint to user if database import didn't finish 2025-07-30 10:25:14 +02:00
Sarah Hoffmann
69e882096c clarify what merging means 2025-07-29 23:04:14 +02:00
Sarah Hoffmann
f300b00c2d docs: add a list of available topics 2025-07-29 22:59:02 +02:00
Sarah Hoffmann
242fcc6e4d adapt BDD tests to different rounding of reduce precision 2025-07-29 22:35:55 +02:00
Sarah Hoffmann
83c6f27f5c reduce precision of interpolations to OSM precision 2025-07-29 22:35:47 +02:00
Sarah Hoffmann
1111597db5 reduce precision of computed centroids to 7 digits 2025-07-29 21:25:14 +02:00
Sarah Hoffmann
866e6bade9 Merge pull request #3789 from lonvia/align-deferred-delete-limits
Align limits for deferring delete and reindexing on insert
2025-07-22 11:15:56 +02:00
Sarah Hoffmann
4cbbe04f7f align limits for deferring delete and reindexing on insert
Previously, when a boundary with an area between 1 and 2 broke, it
was deleted, but on reinsert after repair the addresses were not updated,
resulting in inconsistent data.
2025-07-21 16:11:06 +02:00
Sarah Hoffmann
e1cef3de0a remove unused code 2025-07-21 11:36:57 +02:00
Sarah Hoffmann
c6088cb4e7 Merge pull request #3785 from lonvia/raise-python-to-39
Raise minimum required Python version to 3.9
2025-07-19 23:02:13 +02:00
Sarah Hoffmann
a725cab2fc run old-version CI against oldest supported Python 2025-07-19 19:50:01 +02:00
Sarah Hoffmann
8bb53c22be raise minimum supported Python version to 3.9 2025-07-19 15:23:17 +02:00
Sarah Hoffmann
8a96e4f802 Merge pull request #3781 from lonvia/partial-address-index-lookup
Reduce number of tokens used for index lookups during search
2025-07-15 10:11:12 +02:00
Sarah Hoffmann
a9cd706bb6 adapt test to new lookup limits 2025-07-14 14:21:09 +02:00
Sarah Hoffmann
09b5ea097b restrict pre-selection by postcode to country 2025-07-14 14:21:09 +02:00
Sarah Hoffmann
e111257644 restrict name-only address searches early by postcode 2025-07-14 14:21:09 +02:00
Sarah Hoffmann
93ac1023f7 restrict name-only search more 2025-07-14 14:21:09 +02:00
Sarah Hoffmann
1fe2353682 restrict postcode distance computation to within country 2025-07-14 14:21:09 +02:00
Sarah Hoffmann
6d2b79870c only use most infrequent tokens for search index lookup 2025-07-14 14:18:22 +02:00
Sarah Hoffmann
621d8e785b Merge pull request #3779 from lonvia/fix-zero-devision-direction
Fix direction factor computation on empty strings
2025-07-11 14:51:00 +02:00
Sarah Hoffmann
830307484b Merge pull request #3777 from lonvia/harmonize-transition-penalties
Clean up word transition penalty assignment for searches
2025-07-11 14:17:48 +02:00
Sarah Hoffmann
5d6967a1d0 Merge pull request #3778 from lonvia/remove-log-db-setting
Remove defaults and documentation for LOG_DB setting
2025-07-11 14:17:24 +02:00
Sarah Hoffmann
26903aec0b add BDD test for empty queries 2025-07-11 14:16:48 +02:00
Sarah Hoffmann
c39183e3a5 remove any references to website setup or refresh
These no longer exist.
2025-07-11 11:51:49 +02:00
Sarah Hoffmann
21ef3be433 fix direction factor computation on empty strings 2025-07-11 11:25:14 +02:00
Sarah Hoffmann
99562a197e remove LOG_DB setting, not implemented anymore 2025-07-11 11:15:41 +02:00
Sarah Hoffmann
fe30663b21 remove penalty from TokenRanges
The parameter is no longer needed.
2025-07-11 11:01:22 +02:00
Sarah Hoffmann
73ee17af95 adapt tests for new function signatures 2025-07-11 11:01:22 +02:00
Sarah Hoffmann
b9252cc348 reduce maximum number of SQL queries per search 2025-07-11 11:01:22 +02:00
Sarah Hoffmann
71025f3f43 fix order of address rankings preferring longest words 2025-07-11 11:01:21 +02:00
Sarah Hoffmann
e4b671f8b1 reinstate penalty for partial only matches 2025-07-11 11:01:21 +02:00
Sarah Hoffmann
7ebd121abc give word break slight advantage towards continuation
prefers longer words
2025-07-11 11:01:21 +02:00
Sarah Hoffmann
4634ad0720 rebalance word transition penalties 2025-07-11 11:01:21 +02:00
Sarah Hoffmann
4a9253a0a9 simplify QueryNode penalty and initial assignment 2025-07-11 11:01:09 +02:00
Sarah Hoffmann
1aeb8a262c Merge pull request #3774 from lonvia/remove-postcodes-from-nameaddressvector
Do not add postcodes from postcode boundaries to address vector
2025-07-08 17:23:05 +02:00
Sarah Hoffmann
ef7e842702 Merge pull request #3773 from lonvia/small-countries
Reduce area for geometry rank for very small countries
2025-07-08 15:01:37 +02:00
Sarah Hoffmann
ec42fda1bd do not add postcodes from postcode boundaries to address vector
Postcodes will be found through a special search, so we can save
the space.
2025-07-08 14:49:16 +02:00
Sarah Hoffmann
287ba2570e reduce area for geometry rank for very small countries 2025-07-08 13:50:20 +02:00
Sarah Hoffmann
4711deeccb Merge pull request #3772 from lonvia/fix-index-use-deletable
split up query for deletable endpoint by osm type
2025-07-08 13:49:31 +02:00
Sarah Hoffmann
cf9e8d6b8e split up query for deletable endpoint by osm type
This is needed to ensure index use on placex.
2025-07-08 11:03:29 +02:00
Sarah Hoffmann
06d5ab4c2d Merge pull request #3770 from lonvia/split-place-search
Split up SQL generation code for searches with and without housenumbers
2025-07-07 17:52:47 +02:00
Sarah Hoffmann
e327512667 adapt BDD test to refusal to search POI names with hnr only 2025-07-07 16:14:58 +02:00
Sarah Hoffmann
3e04eb2ffe increase penalty on mismatching postcodes for address searches
Otherwise there is an imbalance towards matching housenumbers
instead of the actual street (where no housenumber exists).
2025-07-07 16:07:32 +02:00
Sarah Hoffmann
970d81fb27 sort housenumber parents by accuracy first
Sorting them by presence of housenumber only will give an undue
preference to results with a housenumber while disregarding other
factors like matching postcodes.
2025-07-07 12:06:06 +02:00
Sarah Hoffmann
cecdbeb7cf reduce candidates for place search 2025-07-07 12:03:56 +02:00
Sarah Hoffmann
c634e9fc5f differentiate between place searches with and without address 2025-07-07 12:03:56 +02:00
Sarah Hoffmann
13eaea8aae split place search into address search and named search
The presence/absence of housenumbers makes quite a difference for search.
2025-07-07 09:13:48 +02:00
Sarah Hoffmann
ab5f348a4a Merge pull request #3769 from lonvia/refactor-api-searches
Refactor code around creating SQL for search queries
2025-07-02 20:08:11 +02:00
Sarah Hoffmann
11d624e92a split db_searches moving each class in its own file 2025-07-01 22:57:04 +02:00
Sarah Hoffmann
a7797f8b37 Merge pull request #3765 from lonvia/update-ui-docs
Update instructions for UI integration
2025-06-27 20:01:28 +02:00
Sarah Hoffmann
c4dd0d4f95 update instructions for UI integration
Switches from forwarding to the UI by default to forwarding only
when requested. This avoids issues with auto-forwarding illegal URLs.
Also adapts to the much simplified nginx configuration.
2025-06-27 11:22:28 +02:00
Sarah Hoffmann
f43fec0d57 Merge pull request #3764 from lonvia/update-importance
'refresh --importance' also needs to refresh importances in search_name table
2025-06-27 10:02:18 +02:00
Sarah Hoffmann
af82c3debb remove duplicated test
There is a more extensive test of recompute_importance with
result check in test_refresh_wiki_data.py
2025-06-26 22:35:38 +02:00
Sarah Hoffmann
1ab4d445ea Merge pull request #3762 from lonvia/remove-gazetteer-output-support
Remove support for deprecated gazetteer osm2pgsql output
2025-06-26 20:28:16 +02:00
Sarah Hoffmann
678702ceb7 rewrite importances in search_name after updating in placex 2025-06-26 20:27:37 +02:00
Sarah Hoffmann
f9eb93c4ab remove support for deprecated gazetteer osm2pgsql output 2025-06-25 23:09:08 +02:00
Sarah Hoffmann
f97a0a76f2 Merge pull request #3747 from anqixxx/fix-special-phrases-filtering
Special Phrases Filtering: Add Command Line Functionality
2025-06-06 21:37:17 +02:00
anqixxx
cf9b946eba Added skip for when min = 0 2025-06-05 09:25:14 +08:00
anqixxx
7dc3924a3c Added default min = 0 argument for private functions
2025-06-04 01:12:36 -07:00
anqixxx
20cf4b56b9 Refactored min and associated tests to follow greater-than-or-equal-to logic, so that min=0 accounts for no filtering
2025-06-04 00:53:52 -07:00
anqixxx
40d5b78eb8 Added command line (default 0) min argument for minimum filtering, updated args.py to reflect this 2025-06-04 00:53:52 -07:00
Sarah Hoffmann
8d0e767826 Merge pull request #3748 from lonvia/airports
Improve finding airports by their codes
2025-06-02 14:39:02 +02:00
Sarah Hoffmann
87a8c246a0 improve result cutting when a POI comes out with top importance 2025-06-01 12:00:36 +02:00
Sarah Hoffmann
90050de717 only rerank results if there is more than one
With one result order is obvious.
2025-06-01 11:55:27 +02:00
Sarah Hoffmann
10a7d1106d reduce influence of query rematching a little bit 2025-06-01 11:54:21 +02:00
Sarah Hoffmann
f2236f68f1 when rematching only distinguish between perfect, somewhat and bad match 2025-06-01 11:53:23 +02:00
Sarah Hoffmann
831fccdaee add FAA codes (US version of IATA codes) for airports 2025-06-01 11:49:55 +02:00
Sarah Hoffmann
d2e691b63f work around bogus type error in latest starlette 2025-05-31 09:43:48 +02:00
Sarah Hoffmann
2a508b6c99 fix missing optional return 2025-05-30 12:03:00 +02:00
Sarah Hoffmann
02c3a6fffa Merge pull request #3744 from lonvia/add-unnamed-cemetries
Include unnamed cemeteries in POIs
2025-05-28 11:51:23 +02:00
Sarah Hoffmann
26348764d4 add landuse=cemetery as POI even when unnamed 2025-05-28 09:48:08 +02:00
Sarah Hoffmann
f8a56ab6e6 Merge pull request #3742 from lonvia/korean-defaults
Remove English as default language for South Korea
2025-05-26 14:13:54 +02:00
Sarah Hoffmann
75b4c7e56b adapt to changed loop handling of pytest_asyncio 2025-05-26 11:51:20 +02:00
Sarah Hoffmann
9f1dfb1876 remove English as default language for South Korea 2025-05-26 10:28:14 +02:00
Sarah Hoffmann
730b4204f6 Merge pull request #3741 from dave-meyer/patch-1
docs: Added missing code span for search API parameter value
2025-05-26 09:21:40 +02:00
Dave Meyer
4898704b5a docs: Added missing code span for search API parameter value 2025-05-25 20:42:09 +02:00
Sarah Hoffmann
0cf470f863 Merge pull request #3710 from anqixxx/fix-special-phrases-filtering
Fix special phrases filtering
2025-05-21 21:34:28 +02:00
anqixxx
6220bde2d6 Added mypy ignore fix for logging.py (library change), as well as quick mac fix on mem.cached 2025-05-21 11:11:56 -07:00
Sarah Hoffmann
a4d3b57f37 Merge pull request #3709 from anqixxx/update-readme
Improve README formatting and add install steps
2025-05-21 19:49:12 +02:00
anqixxx
618fbc63d7 Added testing to test get classtype pairs in import special phrases 2025-05-21 10:39:51 -07:00
anqixxx
3f51cb3fd1 Made the limit configurable with an optional argument, updating the testing as well to reflect this. default is now 0, meaning that it will return everything that occurs more than once. Removed mock database test, and got rid of fetch all. Rebased all tests to monkeypatch 2025-05-21 10:38:34 -07:00
anqixxx
59a947c5f5 Removed class type pair getter that used style sheets from both spi_importer and the associated testing function 2025-05-21 10:38:08 -07:00
anqixxx
1952290359 Removed magic mocking, using monkeypatch instead, and using a placex table to simulate a 'real database' 2025-05-21 10:37:42 -07:00
anqixxx
1a323165f9 Filter special phrases by style and frequency to fix #235 2025-05-21 10:36:46 -07:00
anqixxx
9c2fdf5eae Improve README formatting and add install steps, adding a general cloning step before the virtual environment. This would have been helpful for me during Nominatim setup 2025-05-21 10:14:36 -07:00
Sarah Hoffmann
800c56642b tweak full count cut-off (as per deployment on osm.org) 2025-05-11 11:48:07 +02:00
Sarah Hoffmann
b51fed025c Merge pull request #3732 from lonvia/exclude-country-from-direction-penalty
Exclude address searches with country from direction penalty
2025-04-30 10:45:37 +02:00
Sarah Hoffmann
34b72591cc exclude address searches with country from direction penalty
Countries are not adequately represented by partial term counts.
2025-04-29 17:37:31 +02:00
Sarah Hoffmann
bc450d110c Merge pull request #3722 from emmanuel-ferdman/master
resolve datetime deprecation warnings
2025-04-22 14:21:05 +02:00
Sarah Hoffmann
388acf4727 Merge pull request #3726 from lonvia/revert-json-format-change
Revert accidental change in json output format
2025-04-18 14:43:51 +02:00
Sarah Hoffmann
3999977941 revert accidental change in json output format 2025-04-18 12:05:25 +02:00
Emmanuel Ferdman
df58870e3f resolve datetime deprecation warnings
Signed-off-by: Emmanuel Ferdman <emmanuelferdman@gmail.com>
2025-04-17 11:15:16 -07:00
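The usual shape of this kind of fix, shown here as a general illustration rather than the exact diff of the commit: the deprecated naive-UTC helper is replaced with a timezone-aware call.

```python
from datetime import datetime, timezone

# deprecated since Python 3.12 and emits a DeprecationWarning:
#   stamp = datetime.utcnow()
stamp = datetime.now(timezone.utc)   # timezone-aware replacement
```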
Sarah Hoffmann
478a8741db Merge pull request #3719 from lonvia/query-direction
Estimate query direction
2025-04-17 15:17:56 +02:00
Sarah Hoffmann
7f710d2394 add a comment about the precomputed denominator 2025-04-15 09:38:05 +02:00
Sarah Hoffmann
06e39e42d8 add direction penalties
Direction penalties are estimated by getting the name to address
ratio usage for each partial term in the query and computing the
linear regression of that ratio over the entire phrase. Or to put
it in other words: we try to determine if the terms at the beginning
or the end of the query are more likely to constitute a name.

Direction penalties are currently used only in classic name queries.
2025-04-11 20:41:06 +02:00
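A rough sketch of the idea outlined above, not the actual implementation; the per-term counts are assumed inputs. The name-vs-address usage ratio is computed per partial term and a line is fitted over the term positions, so the slope hints where the name part of the query sits.

```python
from statistics import linear_regression  # Python 3.10+
from typing import Sequence

def direction_estimate(name_counts: Sequence[int],
                       addr_counts: Sequence[int]) -> float:
    # ratio of name usage per partial term, in query order
    ratios = [n / (n + a) for n, a in zip(name_counts, addr_counts) if n + a > 0]
    if len(ratios) < 2:
        return 0.0
    slope, _ = linear_regression(range(len(ratios)), ratios)
    # positive slope: name terms tend towards the end of the query,
    # negative slope: towards the beginning
    return slope
```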
Sarah Hoffmann
2ef0e20a3f reorganise token reranking
As the reranking is about changing penalties in presence of other
tokens, change the data structure to have the other tokens readily
available.
2025-04-11 13:38:34 +02:00
Sarah Hoffmann
b680d81f0a ensure that bailout-check is done after each iteration 2025-04-11 11:02:11 +02:00
Sarah Hoffmann
e0e067b1d6 replace use of range when computing word list 2025-04-11 09:59:04 +02:00
Sarah Hoffmann
3980791cfd use iterator instead of list to go over partials 2025-04-11 09:38:24 +02:00
Sarah Hoffmann
497e27bb9a move partial token into a separate field in the query struct
There is exactly one token to be expected and the token is usually
present.
2025-04-11 08:57:34 +02:00
Sarah Hoffmann
1db717b886 Merge pull request #3716 from lonvia/github-cache-osm2pgsql-binary
GitHub actions: cache compiled osm2pgsql binary

For the tests on Ubuntu 22.04 we need to compile osm2pgsql because the version they ship is too old. This adds caching of the compiled binary, so that we don't need to recompile for each CI run. Together with the new BDD tests that shaves around 10 min off a CI run.
2025-04-10 17:20:32 +02:00
Sarah Hoffmann
b47c8ccfb1 actions: cache compiled osm2pgsql binary 2025-04-10 16:06:27 +02:00
Sarah Hoffmann
63b055283d Merge pull request #3714 from lonvia/postcode-update-without-project-dir
Change postcode update function to work without a project directory
2025-04-10 08:51:22 +02:00
Sarah Hoffmann
b80e6914e7 Merge pull request #3715 from lonvia/demote-tags-to-fallbacks
Demote historic and tourism=attraction to fallback tags
2025-04-10 08:51:06 +02:00
Sarah Hoffmann
9d00a137fe demote historic and tourism=attraction to fallback tags 2025-04-09 20:15:18 +02:00
Sarah Hoffmann
97d9e3c548 allow updating postcodes without a project directory
Postcodes will then be updated without looking for external postcodes.
2025-04-09 20:04:01 +02:00
Sarah Hoffmann
e4180936c1 Merge pull request #3713 from lonvia/bdd-pytest-db-test
Move BDD tests to pytest-bdd
2025-04-09 19:37:30 +02:00
Sarah Hoffmann
34e0ecb44f update documentation for BDD tests 2025-04-09 15:21:50 +02:00
Sarah Hoffmann
d95e9737da remove usage of behave 2025-04-09 14:57:39 +02:00
Sarah Hoffmann
b34991d85f add BDD tests for DB 2025-04-09 14:52:34 +02:00
Sarah Hoffmann
5f44aa2873 improve table comparison 2025-04-04 11:02:51 +02:00
Sarah Hoffmann
dae643c040 move database setup to generic conftest.py 2025-04-04 11:02:51 +02:00
Sarah Hoffmann
ee62d5e1cf remove old behave osm2pgsql BDD tests 2025-04-04 11:02:51 +02:00
Sarah Hoffmann
fb440f29a2 implement BDD osm2pgsql tests with pytest-bdd 2025-04-04 11:02:51 +02:00
Sarah Hoffmann
0f725b1880 enable python-bdd for github actions 2025-04-04 11:02:51 +02:00
Sarah Hoffmann
39f56ba4b8 restrict coordinate output to 7 digits 2025-04-04 11:02:51 +02:00
Sarah Hoffmann
6959577aa4 replace behave BDD API tests with pytest-bdd tests 2025-04-04 11:02:51 +02:00
Sarah Hoffmann
50d4b0a386 Merge pull request #3687 from asharmalik19/test-linked-places-language
test: linked places expand default language names
2025-04-04 10:58:53 +02:00
Ashar
9ff93bdb3d Update linked places name test
Clean up test scenario by removing extra language variations and
improving table readability.
2025-04-03 14:30:18 -04:00
Ashar
e0bf553aa5 test: linked places expand default language names
Add failing test for issue #2714 to verify default language expansion
2025-04-03 14:30:18 -04:00
Sarah Hoffmann
2ce2d031fa Merge pull request #3702 from lonvia/remove-tokenizer-dir
Remove automatic setup of tokenizer directory

So far the tokenizer factory would create a directory for private data for the tokenizer and then hand in the directory location to the tokenizer.

ICU tokenizer doesn't need any extra data anymore, so it doesn't make sense to create a directory which then remains empty. If a tokenizer needs such a directory in the future, it needs to create it on its own and make sure to handle the situation correctly where no project directory is used at all.
2025-04-03 09:04:48 +02:00
Sarah Hoffmann
186f562dd7 remove automatic setup of tokenizer directory
ICU tokenizer doesn't need any extra data anymore, so it doesn't
make sense to create a directory which then remains empty. If a
tokenizer needs such a directory in the future, it needs to create
it on its own and make sure to handle the situation correctly where
no project directory is used at all.
2025-04-02 20:20:04 +02:00
Sarah Hoffmann
c5bbeb626f Merge pull request #3700 from lonvia/ignore-inherited-addresses
Ignore POIs with inherited addresses for the address layer
2025-04-02 12:00:45 +02:00
Sarah Hoffmann
3bc77629c8 ignore POIs with inherited addresses for the address layer
We know that there is a building which describes the address as a
polygon and is therefore more suitable.
2025-04-02 10:30:45 +02:00
Sarah Hoffmann
6cf1287c4e Merge pull request #3686 from astridx/output_names
Output names as setting
2025-04-01 20:16:15 +02:00
Sarah Hoffmann
a49e8b9cf7 Merge pull request #3675 from TuringVerified/generic-preprocessors
Add generic preprocessors
2025-04-01 20:14:43 +02:00
TuringVerified
2eeec46040 Remove unnecessary assert statement, Fix regex_replace docstring and simplify regex_replace 2025-04-01 18:54:30 +05:30
TuringVerified
6d5a4a20c5 Update documentation, optimise regex_replace, add tests 2025-04-01 18:54:30 +05:30
TuringVerified
4665ea3e77 Add generic preprocessor 2025-04-01 18:54:30 +05:30
Sarah Hoffmann
9cf5eee5d4 add instructions for pip package upload 2025-04-01 11:59:03 +02:00
Sarah Hoffmann
fce279226f prepare release 5.1.0 2025-04-01 10:16:35 +02:00
Sarah Hoffmann
54d895c4ce Merge pull request #3695 from TuringVerified/doc-dependencies
[Small fix] Add documentation to install extras for mkdocstrings
2025-04-01 09:34:08 +02:00
TuringVerified
896a1c9d12 Add mkdocstrings extra 2025-04-01 11:06:46 +05:30
Sarah Hoffmann
32728d6c89 Merge pull request #3693 from lonvia/remove-unused-sql
Remove SQL function for address lookup
2025-03-31 17:11:39 +02:00
astridx
12ad95067d output names as setting 2025-03-31 16:55:05 +02:00
Sarah Hoffmann
bfd1c83cb0 Merge pull request #3692 from lonvia/word-lookup-variants
Avoid matching penalty for abbreviated search terms
2025-03-31 16:38:31 +02:00
Sarah Hoffmann
bbadc62371 remove SQL function for address lookup
This is now done in Python.
2025-03-31 15:09:40 +02:00
Sarah Hoffmann
5c9d3ca8d2 Merge pull request #3691 from lonvia/more-search-tweaks
More tweaks to search weights
2025-03-31 15:06:09 +02:00
Sarah Hoffmann
be4ba370ef adapt tests to extended results 2025-03-31 14:52:50 +02:00
Sarah Hoffmann
3cb183ffb0 add lookup word to variants in word table 2025-03-31 14:52:50 +02:00
Sarah Hoffmann
58ef032a2b do not write any word counts on initial word insert 2025-03-31 14:52:50 +02:00
Sarah Hoffmann
1705bb5f57 do not save word counts of 1
This is the default setting, which will be assumed when the count is
missing.
2025-03-31 14:52:50 +02:00
Sarah Hoffmann
f2aa15778f always use lookup when requested
Doesn't seem to cause any issues in production.
2025-03-31 11:38:21 +02:00
Sarah Hoffmann
efe65c3e49 increase allowable address counts 2025-03-31 11:38:21 +02:00
Sarah Hoffmann
51847ebfeb more aggressively reduce expected count for multi-word terms
Improves searching of non-latin scripts with forced token spaces.
2025-03-31 11:18:22 +02:00
Sarah Hoffmann
46579f08e4 Merge pull request #3690 from lonvia/fix-signature
Fix function signature for newer SQLAlchemy
2025-03-31 11:17:03 +02:00
Sarah Hoffmann
d4994a152b fix function signature for newer SQLAlchemy 2025-03-31 09:42:29 +02:00
Sarah Hoffmann
00b3ace3cf Merge pull request #3684 from lonvia/compact-en-variants
Clean up English variants
2025-03-24 15:15:13 +01:00
Sarah Hoffmann
522bc942cf restrict some English variants to end of word 2025-03-21 21:22:38 +01:00
Sarah Hoffmann
d6e749d621 make English variant list more compact 2025-03-21 21:13:34 +01:00
Sarah Hoffmann
13cfb7efe2 Merge pull request #3682 from lonvia/fix-postcode-case
Fix case issues when parsing postcodes
2025-03-21 11:41:24 +01:00
Sarah Hoffmann
35baf77b18 make query upper-case when parsing postcodes
The postcode patterns expect upper-case letters.
2025-03-21 09:44:15 +01:00
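A tiny illustration of the fix; the pattern below is a simplified stand-in, not the configured postcode pattern. The query is upper-cased before matching because the patterns only list upper-case letters.

```python
import re

# simplified UK-style pattern written, as usual, for upper-case letters only
POSTCODE = re.compile(r'[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$')

query = 'ec1a 1bb'
assert POSTCODE.match(query.upper())   # matches only after upper-casing
assert not POSTCODE.match(query)
```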
Sarah Hoffmann
7e68613cc7 Merge pull request #3679 from lonvia/output-fixes
Minor fixes for v1 frontend code
2025-03-19 21:56:28 +01:00
Sarah Hoffmann
b1fc721f4b fix layer setting for structured search 2025-03-19 17:31:43 +01:00
Sarah Hoffmann
d400fd5f76 fix debug output for lookup type 2025-03-19 17:31:18 +01:00
Sarah Hoffmann
e4295dba10 Merge pull request #3678 from lonvia/search-tweaks
Some minor tweaks to postcode parsing in query
2025-03-19 16:00:52 +01:00
Sarah Hoffmann
9419c5adb2 penalize postcode searches with multiple name qualifiers 2025-03-19 10:05:36 +01:00
Sarah Hoffmann
2c61fe08a0 use word_token length when penalizing against postcodes 2025-03-19 09:52:40 +01:00
Sarah Hoffmann
7b3c725f2a postcode token should have transliterated term in word_token 2025-03-19 09:52:40 +01:00
Sarah Hoffmann
edc5ada625 improve handling of leading postcodes
Setting the direction of the query while yielding assignments is
a bad idea because it may override a direction already set.
2025-03-19 09:52:40 +01:00
Sarah Hoffmann
72d3360fa2 Merge pull request #3673 from otbutz/parallel_safe
Mark functions as PARALLEL SAFE
2025-03-18 21:46:53 +01:00
Sarah Hoffmann
0ffe384c57 Merge pull request #3676 from lonvia/adjust-place-levels-sa
Adjust place ranks for Saudi-Arabia
2025-03-18 18:31:48 +01:00
Sarah Hoffmann
9dad5edeb6 adjust for special use of province and municipality in Saudi-Arabia 2025-03-18 16:38:10 +01:00
Thomas Butz
d86d491f2e Mark functions as PARALLEL SAFE 2025-03-13 10:53:11 +01:00
Sarah Hoffmann
3026c333ca adapt typing for latest SQLAlchemy version 2025-03-13 10:49:08 +01:00
Sarah Hoffmann
ad84bbdec7 Merge pull request #3671 from lonvia/remove-osm2pgsql-libdir
Remove code for setting osm2pgsql location via config.lib_dir
2025-03-11 11:22:46 +01:00
Sarah Hoffmann
f5755a7a82 remove code for setting osm2pgsql via config.lib_dir
With the internal osm2pgsql gone, configuration of the binary location
via settings is the only option left that makes sense.
2025-03-11 09:04:05 +01:00
Sarah Hoffmann
cd08956c61 Merge pull request #3670 from lonvia/flake-for-tests
Extend linting with flake to tests
2025-03-10 09:35:24 +01:00
Sarah Hoffmann
12f5719184 remove unused bdd util functions 2025-03-09 17:34:40 +01:00
Sarah Hoffmann
78f839fbd3 enable flake for bdd test code 2025-03-09 17:34:04 +01:00
Sarah Hoffmann
c70dfccaca also enable flake for tests in github actions 2025-03-09 16:03:02 +01:00
Sarah Hoffmann
4cc788f69e enable flake for Python tests 2025-03-09 15:33:24 +01:00
Sarah Hoffmann
5a245e33e0 Merge pull request #3667 from eumiro/simplify-int-float
Simplify int/float manipulation
2025-03-09 09:44:15 +01:00
Miroslav Šedivý
6ff51712fe Simplify int/float manipulation 2025-03-06 19:26:56 +01:00
Sarah Hoffmann
c431e0e45d Merge pull request #3666 from eumiro/math-isclose
Replace custom Almost with stdlib math.isclose
2025-03-06 17:53:01 +01:00
Sarah Hoffmann
c2d62a59cb Merge pull request #3664 from eumiro/consolidate-random
Consolidate usage of random module
2025-03-06 17:52:19 +01:00
Miroslav Šedivý
cd64788a58 Replace custom Almost with stdlib math.isclose 2025-03-05 20:35:01 +01:00
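For reference, the stdlib helper the tests move to; the tolerances below are arbitrary example values rather than the ones used in the test suite.

```python
import math

assert math.isclose(47.123456, 47.1234561, abs_tol=1e-6)   # "almost equal"
assert not math.isclose(1.0, 1.1, rel_tol=1e-3)
```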
Miroslav Šedivý
800a41721a Consolidate usage of random module 2025-03-05 19:38:28 +01:00
Sarah Hoffmann
1b44fe2555 Merge pull request #3665 from lonvia/pattern-matching-postcodes
Add full parsing of postcodes in query
2025-03-05 16:02:03 +01:00
Sarah Hoffmann
6b0d58d9fd restrict postcode parsing in typed phrases
Postcodes can only appear in postcode-type phrases and must then
cover the full phrase
2025-03-05 10:09:33 +01:00
Sarah Hoffmann
afb89f9c7a add unit tests for postcode parser 2025-03-04 16:25:00 +01:00
Sarah Hoffmann
6712627d5e adapt BDD tests to new postcode handling 2025-03-04 15:18:46 +01:00
Sarah Hoffmann
434fbbfd18 add support for country prefixes in postcodes 2025-03-04 15:18:27 +01:00
Sarah Hoffmann
921db8bb2f cache all info of ICUQueryAnalyser in a single object 2025-03-04 08:58:57 +01:00
Sarah Hoffmann
a574b98e4a remove postcode computation for word table during import 2025-03-04 08:57:59 +01:00
Sarah Hoffmann
b2af358f66 reenable ZIP+ test 2025-03-04 08:57:59 +01:00
Sarah Hoffmann
e67ae701ac show token begin and end in debug output 2025-03-04 08:57:59 +01:00
Sarah Hoffmann
fc1c6261ed add postcode parser 2025-03-04 08:57:37 +01:00
Sarah Hoffmann
6759edfb5d make word generation from query a class method 2025-03-04 08:57:37 +01:00
Sarah Hoffmann
e362a965e1 search: merge QueryPart array with QueryNodes
The basic information on terms is pretty much always used together
with the node information. Merging them together saves some
allocation while making lookup easier at the same time.
2025-03-04 08:57:37 +01:00
Sarah Hoffmann
eff60ba6be enable parsing of US ZIP+ codes
The four-digit part of these postcodes will simply be ignored.
2025-02-25 20:29:06 +01:00
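Illustration only, using a simplified pattern rather than the shipped configuration: a ZIP+4 value still matches as a postcode, but only the five-digit part is kept.

```python
import re

ZIP = re.compile(r'(\d{5})(?:-\d{4})?$')

match = ZIP.match('94103-1234')
assert match and match.group(1) == '94103'   # the +4 extension is dropped
```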
Sarah Hoffmann
157414a053 Merge pull request #3659 from lonvia/custom-datrie-structure
Replace datrie library with a simple custom Python implementation
2025-02-24 16:49:42 +01:00
Sarah Hoffmann
18d4996bec remove datrie dependency 2025-02-24 10:24:21 +01:00
Sarah Hoffmann
13db4c9731 replace datrie library with a simpler pure-Python class 2025-02-24 10:24:21 +01:00
Sarah Hoffmann
f567ea89cc Merge pull request #3658 from lonvia/minor-query-parsing-optimisations
Minor query parsing optimisations
2025-02-24 10:16:47 +01:00
Sarah Hoffmann
3e718e40d9 adapt documentation for PhraseType type 2025-02-21 17:16:42 +01:00
Sarah Hoffmann
49bd18b048 replace PhraseType enum with simple int constants 2025-02-21 16:44:12 +01:00
Sarah Hoffmann
31412e0674 replace TokenType enum with simple char constants 2025-02-21 10:23:41 +01:00
Sarah Hoffmann
4577669213 replace BreakType enum with simple char constants 2025-02-21 09:57:48 +01:00
Sarah Hoffmann
9bf1428d81 consistently use query module as qmod 2025-02-21 09:31:21 +01:00
Sarah Hoffmann
b56edf3d0a avoid yielding when extracting words from query 2025-02-20 23:32:39 +01:00
Sarah Hoffmann
abc911079e remove word_number counting for phrases
We can just examine the break types to know if we are dealing
with a partial token.
2025-02-20 17:36:50 +01:00
Sarah Hoffmann
adabfee3be Merge pull request #3655 from lonvia/remove-name-ranking-in-postcode-search
Tweak penalties for postcode searches
2025-02-20 14:32:43 +01:00
Sarah Hoffmann
46c4446dc2 remove address penalty for postcode search
Searches of the form <postcode> <city> are in fact quite common.
2025-02-20 11:11:45 +01:00
Sarah Hoffmann
add9244a2f do not rerank address by full match in postcode search
The reranking result will not be completely correct because
the address of a postcode refers to the address _and_ name
of the parent, and reranking was only done against the
address. We assume here that the postcode is precise enough
as to not require a penalty for partial matches.
2025-02-20 10:29:03 +01:00
Sarah Hoffmann
96d7a8e8f6 Merge pull request #3653 from lonvia/trailing-spaces-in-normalization
Strip leading and trailing space markers during normalization
2025-02-19 17:25:59 +01:00
Sarah Hoffmann
55c3176957 strip normalisation results of normal and special spaces 2025-02-19 14:40:35 +01:00
Sarah Hoffmann
e29823e28f add test for structured query with leading spaces 2025-02-19 10:31:36 +01:00
Sarah Hoffmann
97ed168996 Merge pull request #3652 from lonvia/update-variants
Cleanup and updates of tokenizer variant configuration
2025-02-18 19:47:45 +01:00
Sarah Hoffmann
9b8ef97d4b Merge pull request #3649 from lonvia/actions-move-to-ubuntu22
Move GitHub actions to Ubuntu-22 image
2025-02-18 13:21:09 +01:00
Sarah Hoffmann
4f3c88f0c1 remove e-ë mutation, this is taken care of by transliteration 2025-02-18 10:31:44 +01:00
mhsr21
7781186f3c Add USPS Standard Suffix Abbreviation 2025-02-18 09:28:13 +01:00
Sarah Hoffmann
f78686edb8 fix Norwegian variants
More cases of 'no' being interpreted as false by YAML.
2025-02-18 09:28:13 +01:00
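The pitfall referred to above can be reproduced directly with PyYAML, which follows YAML 1.1 boolean rules; quoting the value avoids the problem.

```python
import yaml

print(yaml.safe_load('lang: no'))     # {'lang': False} -- 'no' parsed as a boolean
print(yaml.safe_load('lang: "no"'))   # {'lang': 'no'}
```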
Sarah Hoffmann
e330cd3162 remove ineffective and duplicate variants 2025-02-18 09:28:13 +01:00
Sarah Hoffmann
671af4cff2 Merge pull request #3555 from IvanShift/patch-1
Fixed Russian abbreviation list
2025-02-17 18:44:11 +01:00
Sarah Hoffmann
e612b7d550 actions: use Debians's script for adding the Postgres apt repo 2025-02-17 17:56:23 +01:00
Sarah Hoffmann
0b49d01703 actions: move tests to Ubuntu-20 2025-02-17 17:54:49 +01:00
Sarah Hoffmann
f6bc8e153f Merge pull request #3648 from lonvia/extratags-for-geocodejson
Enable output of extratags for geocodejson format
2025-02-17 11:14:52 +01:00
Sarah Hoffmann
f143ecaf1c add documentation for new extra field 2025-02-17 10:04:23 +01:00
Sarah Hoffmann
6730c8bac8 add optional output of extratags to geocodejson 2025-02-16 10:16:40 +01:00
Sarah Hoffmann
ee8915f2b6 prepare 5.0.0 release 2025-02-05 10:54:38 +01:00
Sarah Hoffmann
5475bf7b9c Merge pull request #3635 from lonvia/replace-wikimedia-importance-test-data
Update wikimedia importance file for test database
2025-01-14 16:49:52 +01:00
Sarah Hoffmann
95e2d8c846 adapt tests to changed wikimedia importance test table 2025-01-14 14:19:17 +01:00
Sarah Hoffmann
7552818866 replace wikimedia importance file for test data with CSV version 2025-01-14 09:16:25 +01:00
Sarah Hoffmann
db3991af74 Merge pull request #3626 from lonvia/import-performance
Import performance
2025-01-10 16:44:33 +01:00
Sarah Hoffmann
4523b9aaed Merge pull request #3631 from lonvia/avoid-transactions
Creating tables and indexes in autocommit mode
2025-01-10 16:44:18 +01:00
Sarah Hoffmann
8b1cabebd6 Merge pull request #3633 from lonvia/restrict-long-ways
Ignore overly long ways during import
2025-01-10 16:06:37 +01:00
Sarah Hoffmann
0cf636a80c ignore overly long ways during import 2025-01-10 13:55:43 +01:00
Sarah Hoffmann
c2cb6722fe use autocommit when creating tables and indexes
Might avoid some deadlock situations with autovacuum.
2025-01-09 17:14:37 +01:00
Sarah Hoffmann
f8337bedb2 Merge pull request #3629 from lonvia/additional-breaks
Introduce new break types and phrase splitting for Japanese addresses
2025-01-09 13:55:29 +01:00
Sarah Hoffmann
efc09a5cfc add japanese phrase preprocessing
Code adapted from GSOC code by @miku.
2025-01-09 09:24:10 +01:00
Sarah Hoffmann
86ad9efa8a keep break indicators [:-] during normalisation
All punctuation will be converted to '-'. Soft breaks : may be
added by preprocessors. The break signs are only used during
query analysis and are ignored during import token analysis.
2025-01-09 09:21:55 +01:00
Sarah Hoffmann
d984100e23 add inner word break penalty 2025-01-07 21:42:25 +01:00
Sarah Hoffmann
499110f549 add SOFT_PHRASE break and enable parsing
Also enables parsing of PART breaks.
2025-01-06 17:10:24 +01:00
Sarah Hoffmann
267e5dac0d split up MultiPolygons before adding them to large_areas table 2024-12-22 09:15:16 +01:00
Sarah Hoffmann
32d3eb46d5 move geometry split into insertLocationAreaLarge()
thus insert only needs to be called once.
2024-12-22 09:15:16 +01:00
Sarah Hoffmann
c8a0dc8af1 more efficient belongs-to-address determination 2024-12-22 09:15:16 +01:00
Sarah Hoffmann
14ecfc7834 Merge pull request #3619 from lonvia/demote-farms
Remove farms and isolated dwellings from computed addresses
2024-12-22 09:13:42 +01:00
Sarah Hoffmann
cad44eb00c remove farms and isolated dwellings from computed addresses
Farms and isolated dwellings are usually confined to a very small
area. It does not make sense to use them automatically in the
addresses of surrounding features. Using them for parenting
via addr:place still works.
2024-12-20 22:59:02 +01:00
Sarah Hoffmann
f76dbb0a16 docs: update Update docs for virtualenv use 2024-12-20 11:27:45 +01:00
Sarah Hoffmann
8dd218a1d0 Merge pull request #3618 from osm-search/settings-md-table-space-osm-index
Settings.md - one setting was repeated
2024-12-19 08:40:31 +01:00
IvanShift
bea9249e38 Added "дом" and fixed order "школа" 2024-10-06 17:59:59 +03:00
Alexander Sapozhnikov
1e4677b668 Expand Russian abbreviation list 2022-11-01 04:01:27 +05:00
Alexander Sapozhnikov
7f909dbbd8 Add replacement for Russian 2022-11-01 02:54:07 +05:00
575 changed files with 15183 additions and 12058 deletions

View File

@@ -6,3 +6,6 @@ extend-ignore =
E711
per-file-ignores =
__init__.py: F401
test/python/utils/test_json_writer.py: E131
**/conftest.py: E402
test/bdd/*: F821

12
.github/PULL_REQUEST_TEMPLATE.md vendored Normal file
View File

@@ -0,0 +1,12 @@
## Summary
<!-- Describe the purpose of your pull request and, if present, link to existing issues. -->
## AI usage
<!-- Please list where and to what extent AI was used. -->
## Contributor guidelines (mandatory)
<!-- We only accept pull requests that follow our guidelines. A deliberate violation may result in a ban. -->
- [ ] I have adhered to the [coding style](https://github.com/osm-search/Nominatim/blob/master/CONTRIBUTING.md#coding-style)
- [ ] I have [tested](https://github.com/osm-search/Nominatim/blob/master/CONTRIBUTING.md#testing) the proposed changes
- [ ] I have [disclosed](https://github.com/osm-search/Nominatim/blob/master/CONTRIBUTING.md#using-ai-assisted-code-generators) above any use of AI to generate code, documentation, or the pull request description

View File

@@ -22,7 +22,7 @@ runs:
- name: Install prerequisites from apt
run: |
sudo apt-get install -y -qq python3-icu python3-datrie python3-jinja2 python3-psutil python3-dotenv python3-yaml python3-sqlalchemy python3-psycopg python3-asyncpg
sudo apt-get install -y -qq python3-icu python3-datrie python3-jinja2 python3-psutil python3-dotenv python3-yaml python3-sqlalchemy python3-psycopg python3-asyncpg python3-mwparserfromhell
shell: bash
if: inputs.dependencies == 'apt'

View File

@@ -11,10 +11,8 @@ runs:
steps:
- name: Remove existing PostgreSQL
run: |
sudo /usr/share/postgresql-common/pgdg/apt.postgresql.org.sh -y
sudo apt-get purge -yq postgresql*
sudo apt install curl ca-certificates gnupg
curl https://www.postgresql.org/media/keys/ACCC4CF8.asc | gpg --dearmor | sudo tee /etc/apt/trusted.gpg.d/apt.postgresql.org.gpg >/dev/null
sudo sh -c 'echo "deb https://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
sudo apt-get update -qq
shell: bash

View File

@@ -37,18 +37,20 @@ jobs:
needs: create-archive
strategy:
matrix:
flavour: ["ubuntu-20", "ubuntu-24"]
flavour: ["ubuntu-22", "ubuntu-24"]
include:
- flavour: ubuntu-20
ubuntu: 20
- flavour: ubuntu-22
ubuntu: 22
postgresql: 12
lua: '5.1'
dependencies: pip
python: '3.9'
- flavour: ubuntu-24
ubuntu: 24
postgresql: 17
postgresql: 18
lua: '5.3'
dependencies: apt
python: 'builtin'
runs-on: ubuntu-${{ matrix.ubuntu }}.04
@@ -68,26 +70,40 @@ jobs:
with:
dependencies: ${{ matrix.dependencies }}
- uses: actions/cache@v4
with:
path: |
/usr/local/bin/osm2pgsql
key: osm2pgsql-bin-22-1
if: matrix.ubuntu == '22'
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python }}
if: matrix.python != 'builtin'
- name: Compile osm2pgsql
run: |
sudo apt-get install -y -qq libboost-system-dev libboost-filesystem-dev libexpat1-dev zlib1g-dev libbz2-dev libpq-dev libproj-dev libicu-dev liblua${LUA_VERSION}-dev lua-dkjson nlohmann-json3-dev
mkdir osm2pgsql-build
cd osm2pgsql-build
git clone https://github.com/osm2pgsql-dev/osm2pgsql
mkdir build
cd build
cmake ../osm2pgsql
make
sudo make install
cd ../..
rm -rf osm2pgsql-build
if: matrix.ubuntu == '20'
if [ ! -f /usr/local/bin/osm2pgsql ]; then
sudo apt-get install -y -qq libboost-system-dev libboost-filesystem-dev libexpat1-dev zlib1g-dev libbz2-dev libpq-dev libproj-dev libicu-dev liblua${LUA_VERSION}-dev lua-dkjson nlohmann-json3-dev
mkdir osm2pgsql-build
cd osm2pgsql-build
git clone https://github.com/osm2pgsql-dev/osm2pgsql
mkdir build
cd build
cmake ../osm2pgsql
make
sudo make install
cd ../..
rm -rf osm2pgsql-build
else
sudo apt-get install -y -qq libexpat1 liblua${LUA_VERSION}
fi
if: matrix.ubuntu == '22'
env:
LUA_VERSION: ${{ matrix.lua }}
- name: Install test prerequisites
run: ./venv/bin/pip install behave==1.2.6
- name: Install test prerequisites (apt)
run: sudo apt-get install -y -qq python3-pytest python3-pytest-asyncio uvicorn python3-falcon python3-aiosqlite python3-pyosmium
if: matrix.dependencies == 'apt'
@@ -96,11 +112,14 @@ jobs:
run: ./venv/bin/pip install pytest-asyncio falcon starlette asgi_lifespan aiosqlite osmium uvicorn
if: matrix.dependencies == 'pip'
- name: Install test prerequisites
run: ./venv/bin/pip install pytest-bdd
- name: Install latest flake8
run: ./venv/bin/pip install -U flake8
- name: Python linting
run: ../venv/bin/python -m flake8 src
run: ../venv/bin/python -m flake8 src test/python test/bdd
working-directory: Nominatim
- name: Install mypy and typechecking info
@@ -108,7 +127,7 @@ jobs:
if: matrix.dependencies == 'pip'
- name: Python static typechecking
run: ../venv/bin/python -m mypy --strict --python-version 3.8 src
run: ../venv/bin/python -m mypy --strict --python-version 3.9 src
working-directory: Nominatim
if: matrix.dependencies == 'pip'
@@ -118,8 +137,8 @@ jobs:
- name: BDD tests
run: |
../../../venv/bin/python -m behave -DREMOVE_TEMPLATE=1 --format=progress3
working-directory: Nominatim/test/bdd
../venv/bin/python -m pytest test/bdd --nominatim-purge
working-directory: Nominatim
install:
runs-on: ubuntu-latest
@@ -307,7 +326,7 @@ jobs:
- uses: ./Nominatim/.github/actions/setup-postgresql
with:
postgresql-version: 17
postgresql-version: 18
- name: Install Python dependencies
run: |

View File

@@ -87,7 +87,6 @@ Checklist for releases:
* [ ] increase versions in
* `src/nominatim_api/version.py`
* `src/nominatim_db/version.py`
* CMakeLists.txt
* [ ] update `ChangeLog` (copy information from patch releases from release branch)
* [ ] complete `docs/admin/Migration.md`
* [ ] update EOL dates in `SECURITY.md`
@@ -114,3 +113,5 @@ Checklist for releases:
* run `nominatim --version` to confirm correct version
* [ ] tag new release and add a release on github.com
* [ ] build pip packages and upload to pypi
* `make build`
* `twine upload dist/*`

View File

@@ -1,3 +1,83 @@
5.2.0
* increase minimum required Python to 3.9
* index and output entrances of buildings and areas (thanks @emlove)
* name tags used for creating display names are now configurable
(thanks @astridx)
* new pattern-replacement query preprocessor (thanks @TuringVerified)
* special phrases can now be filtered by presence of tags (thanks @anqixxx)
* lua import style now always includes tags required by Nominatim
* improved query time reporting and logging
* improve word matching for languages with no word boundaries
* POIs with addresses inherited from surrounding building are no
longer returned in the address layer
* avoid creating a directory for the tokenizer when not needed
* replace behave with pytest-bdd for BDD testing
* refactoring and performance improvements to query parsing
* various smaller updates to styles
* remove English as default language for South Korea
* remove Japanese word variants
* updated country names for Norwegians (thanks @Johannes-Andersen)
* remove support for deprecated osm2pgsql gazetteer style
* fix updating of importances (also needs to update search_name table)
* fix query for deletable endpoint to use index again
* fix reindexing of contained places when a boundary is deleted and reinstated
* fix difference computation error when updating postcodes
* bracket handling sanitizer no longer strips bracket terms in the middle of
a name
* reduce precision of stored coordinates to 7-digits everywhere
* avoid ST_Relate as it seems buggy on some systems
* remove setting for logging queries in DB, no longer functional
* postcode updates no longer require a project directory (needed for tests)
* refactor locale handling code (thanks @anqixxx)
* code updates for newer Python (thanks @emmanuel-ferdman)
* better test coverage (thanks @asharmalik19)
* various fixes and improvements to documentation
(thanks @anqixxx, @dave-meyer, @hasandiwan)
5.1.0
* replace datrie with simple internal trie implementation
* add pattern-based postcode parser for queries,
postcodes no longer need to be present in OSM to be found
* take variants into account when computing token similarity
* add extratags output to geocodejson format
* fix default layer setting used for structured queries
* update abbreviation lists for Russian and English
(thanks @shoorick, @IvanShift, @mhsrn21)
* fix variant generation for Norwegian
* fix normalization around space-like characters
* improve postcode search and handling of postcodes in queries
* reorganise internal query structure and get rid of slow enums
* enable code linting for tests
* various code modernisations in test code (thanks @eumiro)
* remove setting osm2pgsql location via config.lib_dir
* make SQL functions parallel safe as far as possible (thanks @otbutz)
* various fixes and improvements to documentation (thanks @TuringVerified)
5.0.0
* increase required versions for PostgreSQL (12+), PostGIS (3.0+)
* remove installation via cmake and debundle osm2pgsql
* remove deprecated PHP frontend
* remove deprecated legacy tokenizer
* add configurable pre-processing of queries
* add query pre-processor to split up Japanese addresses
* rewrite of osm2pgsql style implementation
(also adds support for osm2pgsql-themepark)
* reduce the number of SQL queries needed to complete a 'lookup' call
* improve computation of centroid for lines with only two points
* improve bbox output for postcode areas
* improve result order by returning the largest object when other things are
equal
* add fallback for reverse geocoding to default country tables
* exclude postcode areas from reverse geocoding
* disable search endpoint when database is reverse-only (regression)
* minor performance improvements to area split algorithm
* switch table and index creation to use autocommit mode to avoid deadlocks
* drop overly long ways during import
* restrict automatic migrations to versions 4.3+
* switch linting from pylint to flake8
* switch tests to use a wikimedia test file in the new CSV style
* various fixes and improvements to documentation
4.5.0
* allow building Nominatim as a pip package
* make osm2pgsql building optional

View File

@@ -18,16 +18,16 @@ build-api:
tests: mypy lint pytest bdd
mypy:
mypy --strict --python-version 3.8 src
mypy --strict --python-version 3.9 src
pytest:
pytest test/python
lint:
flake8 src
flake8 src test/python test/bdd
bdd:
cd test/bdd; behave -DREMOVE_TEMPLATE=1
pytest test/bdd --nominatim-purge
# Documentation

View File

@@ -27,18 +27,25 @@ can be found at nominatim.org as well.
A quick summary of the necessary steps:
1. Create a Python virtualenv and install the packages:
1. Clone this git repository and download the country grid
git clone https://github.com/osm-search/Nominatim.git
wget -O Nominatim/data/country_osm_grid.sql.gz https://nominatim.org/data/country_grid.sql.gz
2. Create a Python virtualenv and install the packages:
python3 -m venv nominatim-venv
./nominatim-venv/bin/pip install packaging/nominatim-{api,db}
2. Create a project directory, get OSM data and import:
3. Create a project directory, get OSM data and import:
mkdir nominatim-project
cd nominatim-project
../nominatim-venv/bin/nominatim import --osm-file <your planet file>
../nominatim-venv/bin/nominatim import --osm-file <your planet file> 2>&1 | tee setup.log
3. Start the webserver:
4. Start the webserver:
./nominatim-venv/bin/pip install uvicorn falcon
../nominatim-venv/bin/nominatim serve

View File

@@ -9,10 +9,11 @@ versions.
| Version | End of support for security updates |
| ------- | ----------------------------------- |
| 5.2.x | 2027-10-29 |
| 5.1.x | 2027-04-01 |
| 5.0.x | 2027-02-06 |
| 4.5.x | 2026-09-12 |
| 4.4.x | 2026-03-07 |
| 4.3.x | 2025-09-07 |
| 4.2.x | 2024-11-24 |
## Reporting a Vulnerability
@@ -31,8 +32,7 @@ description of the nature and severity of the issue. **
Patches for identified security issues are applied to all affected versions and
new minor versions are released. At the same time we release a statement at
the [Nominatim blog](https://nominatim.org/blog/) describing the nature of the
incident. Announcements will also be published at the
[geocoding mailinglist](https://lists.openstreetmap.org/listinfo/geocoding).
incident.
## List of Previous Incidents

View File

@@ -27,7 +27,7 @@ For running Nominatim:
* [PostgreSQL](https://www.postgresql.org) (12+ will work, 13+ strongly recommended)
* [PostGIS](https://postgis.net) (3.0+ will work, 3.2+ strongly recommended)
* [osm2pgsql](https://osm2pgsql.org) (1.8+)
* [Python 3](https://www.python.org/) (3.7+)
* [Python 3](https://www.python.org/) (3.9+)
Furthermore the following Python libraries are required:
@@ -37,7 +37,7 @@ Furthermore the following Python libraries are required:
* [Jinja2](https://palletsprojects.com/p/jinja/)
* [PyICU](https://pypi.org/project/PyICU/)
* [PyYaml](https://pyyaml.org/) (5.1+)
* [datrie](https://github.com/pytries/datrie)
* [mwparserfromhell](https://github.com/earwig/mwparserfromhell/)
These will be installed automatically when using pip installation.

View File

@@ -9,19 +9,27 @@ the following steps:
* Update the frontend: `pip install -U nominatim-api`
* (optionally) Restart updates
If you are still using CMake for the installation of Nominatim, then you
need to update the software in one step before migrating the database.
It is not recommended to do this while the machine is serving requests.
Below you find additional migrations and hints about other structural and
breaking changes. **Please read them before running the migration.**
!!! note
If you are migrating from a version <4.3, you need to install 4.3
first and migrate to 4.3 first. Then you can migrate to the current
and migrate to 4.3 first. Then you can migrate to the current
version. It is strongly recommended to do a reimport instead.
## 4.5.0 -> master
## 5.1.0 -> 5.2.0
### Lua import style: required extratags removed
Tags that are required by Nominatim as extratags are now always included
independent of what is defined in the style. The line
flex.add_for_extratags('required')
is no longer required in custom styles and will throw an error. Simply
remove the line from your style.
## 4.5.0 -> 5.0.0
### PHP frontend removed
@@ -33,6 +41,42 @@ needed. It currently omits a warning and does otherwise nothing. It will be
removed in later versions of Nominatim. So make sure you remove it from your
scripts.
### CMake building removed
Nominatim can now only be installed via pip. Please follow the installation
instructions for the current version to change to pip.
### osm2pgsql no longer vendored in
Nominatim no longer ships its own version of osm2pgsql. Please install a
stock version of osm2pgsql from your distribution. See the
[installation instruction for osm2pgsql](https://osm2pgsql.org/doc/install.html)
for details. A minimum version of 1.8 is required. The current stable versions
of Ubuntu and Debian already ship with an appropriate version. For older
installations, you may have to compile a newer osm2pgsql yourself.
### Legacy tokenizer removed
The `legacy` tokenizer is no longer enabled. This tokenizer has been superseded
by the `ICU` tokenizer a long time ago. In the unlikely case that your database
still uses the `legacy` tokenizer, you must reimport your database.
### osm2pgsql style overhauled
There are some fundamental changes to how customized osm2pgsql styles should
be written. The changes are mostly backwards compatible, i.e. custom styles
should still work with the new implementation. The only exception is a
customization of the `process_tags()` function. This function is no longer
considered public and neither are the helper functions used in it.
They currently still work but will be removed at some point. If you have
been making changes to `process_tags`, please review your style and try
to switch to the new convenience functions.
For more information on the changes, see the
[pull request](https://github.com/osm-search/Nominatim/pull/3615)
and read the new
[customization documentation](https://nominatim.org/release-docs/latest/customize/Import-Styles/).
## 4.4.0 -> 4.5.0
### New structure for Python packages

View File

@@ -36,11 +36,11 @@ The website is now available at `http://localhost:8765`.
## Forwarding searches to nominatim-ui
Nominatim used to provide the search interface directly by itself when
`format=html` was requested. For all endpoints except for `/reverse` and
`/lookup` this even used to be the default.
`format=html` was requested. For the `/search` endpoint this even used
to be the default.
The following section describes how to set up Apache or nginx, so that your
users are forwarded to nominatim-ui when they go to URL that formerly presented
users are forwarded to nominatim-ui when they go to a URL that formerly presented
the UI.
### Setting up forwarding in Nginx
@@ -73,41 +73,28 @@ map $args $format {
# Determine from the URI and the format parameter above if forwarding is needed.
map $uri/$format $forward_to_ui {
default 1; # The default is to forward.
~^/ui 0; # If the URI point to the UI already, we are done.
~/other$ 0; # An explicit non-html format parameter. No forwarding.
~/reverse.*/default 0; # Reverse and lookup assume xml format when
~/lookup.*/default 0; # no format parameter is given. No forwarding.
default 0; # no forwarding by default
~/search.*/default 1; # Use this line only, if search should go to UI by default.
~/reverse.*/html 1; # Forward API calls that UI supports, when
~/status.*/html 1; # format=html is explicitly requested.
~/search.*/html 1;
~/details.*/html 1;
}
```
The `$forward_to_ui` parameter can now be used to conditionally forward the
calls:
```
# When no endpoint is given, default to search.
# Need to add a rewrite so that the rewrite rules below catch it correctly.
rewrite ^/$ /search;
location @php {
# fastcgi stuff..
``` nginx
location / {
if ($forward_to_ui) {
rewrite ^(/[^/]*) https://yourserver.com/ui$1.html redirect;
rewrite ^(/[^/.]*) https://$http_host/ui$1.html redirect;
}
}
location ~ [^/]\.php(/|$) {
# fastcgi stuff..
if ($forward_to_ui) {
rewrite (.*).php https://yourserver.com/ui$1.html redirect;
}
# proxy_pass commands
}
```
!!! warning
Be aware that the rewrite commands are slightly different for URIs with and
without the .php suffix.
Reload nginx and the UI should be available.
### Setting up forwarding in Apache
@@ -159,18 +146,16 @@ directory like this:
RewriteBase "/nominatim/"
# If no endpoint is given, then use search.
RewriteRule ^(/|$) "search.php"
RewriteRule ^(/|$) "search"
# If format-html is explicitly requested, forward to the UI.
RewriteCond %{QUERY_STRING} "format=html"
RewriteRule ^([^/]+)(.php)? ui/$1.html [R,END]
RewriteRule ^([^/.]+) ui/$1.html [R,END]
# If no format parameter is there then forward anything
# but /reverse and /lookup to the UI.
# Optionally: if no format parameter is there then forward /search.
RewriteCond %{QUERY_STRING} "!format="
RewriteCond %{REQUEST_URI} "!/lookup"
RewriteCond %{REQUEST_URI} "!/reverse"
RewriteRule ^([^/]+)(.php)? ui/$1.html [R,END]
RewriteCond %{REQUEST_URI} "/search"
RewriteRule ^([^/.]+) ui/$1.html [R,END]
</Directory>
```

View File

@@ -68,10 +68,10 @@ the update interval no new data has been published yet, it will go to sleep
until the next expected update and only then attempt to download the next batch.
The one-time mode is particularly useful if you want to run updates continuously
but need to schedule other work in between updates. For example, the main
service at osm.org uses it, to regularly recompute postcodes -- a process that
must not be run while updates are in progress. Its update script
looks like this:
but need to schedule other work in between updates. For example, you might
want to regularly recompute postcodes -- a process that
must not be run while updates are in progress. An update script refreshing
postcodes regularly might look like this:
```sh
#!/bin/bash
@@ -109,17 +109,19 @@ Unit=nominatim-updates.service
WantedBy=multi-user.target
```
And then a similar service definition: `/etc/systemd/system/nominatim-updates.service`:
`OnUnitActiveSec` defines how often the individual update command is run.
Then add a service definition for the timer in `/etc/systemd/system/nominatim-updates.service`:
```
[Unit]
Description=Single updates of Nominatim
[Service]
WorkingDirectory=/srv/nominatim
ExecStart=nominatim replication --once
StandardOutput=append:/var/log/nominatim-updates.log
StandardError=append:/var/log/nominatim-updates.error.log
WorkingDirectory=/srv/nominatim-project
ExecStart=/srv/nominatim-venv/bin/nominatim replication --once
StandardOutput=journald
StandardError=inherit
User=nominatim
Group=nominatim
Type=simple
@@ -128,9 +130,9 @@ Type=simple
WantedBy=multi-user.target
```
Replace the `WorkingDirectory` with your project directory. Also adapt user and
group names as required. `OnUnitActiveSec` defines how often the individual
update command is run.
Replace the `WorkingDirectory` with your project directory. `ExecStart` points
to the nominatim binary that was installed in your virtualenv earlier.
Finally, you might need to adapt user and group names as required.
Now activate the service and start the updates:
@@ -140,12 +142,13 @@ sudo systemctl enable nominatim-updates.timer
sudo systemctl start nominatim-updates.timer
```
You can stop future data updates, while allowing any current, in-progress
You can stop future data updates while allowing any current, in-progress
update steps to finish, by running `sudo systemctl stop
nominatim-updates.timer` and waiting until `nominatim-updates.service` isn't
running (`sudo systemctl is-active nominatim-updates.service`). Current output
from the update can be seen like above (`systemctl status
nominatim-updates.service`).
running (`sudo systemctl is-active nominatim-updates.service`).
To check the output from the update process, use journalctl: `journalctl -u
nominatim-updates.service`
#### Catch-up mode
@@ -155,13 +158,13 @@ all changes from the server until the database is up-to-date. The catch-up mode
still respects the parameter `NOMINATIM_REPLICATION_MAX_DIFF`. It downloads and
applies the changes in appropriate batches until all is done.
The catch-up mode is foremost useful to bring the database up to speed after the
The catch-up mode is foremost useful to bring the database up to date after the
initial import. Given that the service usually is not in production at this
point, you can temporarily be a bit more generous with the batch size and
number of threads you use for the updates by running catch-up like this:
```
cd /srv/nominatim
cd /srv/nominatim-project
NOMINATIM_REPLICATION_MAX_DIFF=5000 nominatim replication --catch-up --threads 15
```
@@ -173,13 +176,13 @@ replication catch-up at whatever interval you desire.
When running scheduled updates with catch-up, it is a good idea to choose
a replication source with an update frequency that is an order of magnitude
lower. For example, if you want to update once a day, use an hourly updated
source. This makes sure that you don't miss an entire day of updates when
source. This ensures that you don't miss an entire day of updates when
the source is unexpectedly late to publish its update.
If you want to use the source with the same update frequency (e.g. a daily
updated source with daily updates), use the
continuous update mode. It ensures to re-request the newest update until it
is published.
once mode together with a frequently run systemd script as described above.
It keeps re-requesting the newest update until it has been published.
#### Continuous updates
@@ -197,36 +200,3 @@ parameters:
The update application keeps running forever and retrieves and applies
new updates from the server as they are published.
You can run this command as a simple systemd service. Create a service
description like that in `/etc/systemd/system/nominatim-updates.service`:
```
[Unit]
Description=Continuous updates of Nominatim
[Service]
WorkingDirectory=/srv/nominatim
ExecStart=nominatim replication
StandardOutput=append:/var/log/nominatim-updates.log
StandardError=append:/var/log/nominatim-updates.error.log
User=nominatim
Group=nominatim
Type=simple
[Install]
WantedBy=multi-user.target
```
Replace the `WorkingDirectory` with your project directory. Also adapt user
and group names as required.
Now activate the service and start the updates:
```
sudo systemctl daemon-reload
sudo systemctl enable nominatim-updates
sudo systemctl start nominatim-updates
```

View File

@@ -105,6 +105,13 @@ grouped by type.
Include geometry of result.
| Parameter | Value | Default |
|-----------| ----- | ------- |
| entrances | 0 or 1 | 0 |
When set to 1, include the tagged entrances in the result.
### Language of results
| Parameter | Value | Default |

View File

@@ -49,7 +49,7 @@ Only has an effect for JSON output formats.
| Parameter | Value | Default |
|-----------| ----- | ------- |
| addressdetails | 0 or 1 | 0 |
| addressdetails | 0 or 1 | 1 |
When set to 1, include a breakdown of the address into elements.
The exact content of the address breakdown depends on the output format.
@@ -77,6 +77,12 @@ that is available in the database, e.g. wikipedia link, opening hours.
When set to 1, include a full list of names for the result. These may include
language variants, older names, references and brand.
| Parameter | Value | Default |
|-----------| ----- | ------- |
| entrances | 0 or 1 | 0 |
When set to 1, include the tagged entrances in the result.
### Language of results

View File

@@ -60,6 +60,8 @@ The possible fields are:
* `namedetails` - dictionary with full list of available names including ref etc.
* `geojson`, `svg`, `geotext`, `geokml` - full geometry
(only with the appropriate `polygon_*` parameter)
* `entrances` - array of objects representing tagged entrances for the object, or
null if none are found (only with `entrances=1`)
## JSONv2
@@ -87,6 +89,8 @@ The properties object has the following fields:
* `extratags` - dictionary with additional useful tags like `website` or `maxspeed`
(only with `extratags=1`)
* `namedetails` - dictionary with full list of available names including ref etc.
* `entrances` - array of objects representing tagged entrances for the object, or
null if none are found (only with `entrances=1`)
Use `polygon_geojson` to output the full geometry of the object instead
of the centroid.
@@ -106,8 +110,13 @@ The following feature attributes are implemented:
* `name` - localised name of the place
* `housenumber`, `street`, `locality`, `district`, `postcode`, `city`,
`county`, `state`, `country` -
provided when it can be determined from the address
provided when it can be determined from the address (only with `addressdetails=1`)
* `admin` - list of localised names of administrative boundaries (only with `addressdetails=1`)
* `extra` - dictionary with additional useful tags like `website` or `maxspeed`
(only with `extratags=1`)
* `entrances` - array of objects representing tagged entrances for the object, or
null if none are found (only with `entrances=1`)
Use `polygon_geojson` to output the full geometry of the object instead
of the centroid.
@@ -159,8 +168,8 @@ The place information can be found in the `result` element. The attributes of th
The full address of the result can be found in the content of the
`result` element as a comma-separated list.
Additional information requested with `addressdetails=1`, `extratags=1` and
`namedetails=1` can be found in extra elements.
Additional information requested with `addressdetails=1`, `extratags=1`,
`namedetails=1`, and `entrances=1` can be found in extra elements.
### Search and Lookup
@@ -221,9 +230,9 @@ be more than one. The attributes of that element contain:
When `addressdetails=1` is requested, the localised address parts appear
as subelements with the type of the address part.
Additional information requested with `extratags=1` and `namedetails=1` can
be found in extra elements as sub-element of `extratags` and `namedetails`
respectively.
Additional information requested with `extratags=1`, `namedetails=1`, and
`entrances=1` can be found in extra elements as sub-element of `extratags`,
`namedetails`, and `entrances` respectively.
## Notes on field values
@@ -300,3 +309,78 @@ with a designation label. Per default the following labels may appear:
They roughly correspond to the classification of the OpenStreetMap data
according to either the `place` tag or the main key of the object.
### entrances
Entrance details in the xml and json formats return the latitude and longitude
of the entrance, the osm node ID, the [type of
entrance](https://wiki.openstreetmap.org/wiki/Key:entrance), and any extra tags
associated with the entrance node.
* osm_id
* type
* lat
* lon
* extratags
#### Example
##### JSON
[https://nominatim.openstreetmap.org/details?osmtype=W&osmid=32619803&entrances=1&format=json](https://nominatim.openstreetmap.org/details?osmtype=W&osmid=32619803&entrances=1&format=json)
```json
{
"place_id": 124325848,
"parent_place_id": 123936289,
"osm_type": "W",
"osm_id": 32619803,
"category": "shop",
"type": "supermarket",
"admin_level": 15,
"localname": "PENNY",
...
"entrances": [
{
"osm_id": 1733488238,
"type": "yes",
"lat": "51.0466704",
"lon": "12.8077106",
"extratags": {
"foot": "yes"
}
},
{
"osm_id": 1733488256,
"type": "main",
"lat": "51.0467197",
"lon": "12.8078448",
"extratags": {
"foot": "yes"
}
},
{
"osm_id": 1733498087,
"type": "exit",
"lat": "51.0467081",
"lon": "12.8078131",
"extratags": {
"foot": "yes"
}
},
{
"osm_id": 7914950851,
"type": "service",
"lat": "51.0468487",
"lon": "12.8075876",
"extratags": {
"access": "delivery"
}
}
]
}
```

View File

@@ -98,6 +98,12 @@ that is available in the database, e.g. wikipedia link, opening hours.
When set to 1, include a full list of names for the result. These may include
language variants, older names, references and brand.
| Parameter | Value | Default |
|-----------| ----- | ------- |
| entrances | 0 or 1 | 0 |
When set to 1, include the tagged entrances in the result.
### Language of results
@@ -146,7 +152,7 @@ In terms of address details the zoom levels are as follows:
| Parameter | Value | Default |
|-----------| ----- | ------- |
| layer | comma-separated list of: `address`, `poi`, `railway`, `natural`, `manmade` | _unset_ (no restriction) |
| layer | comma-separated list of: `address`, `poi`, `railway`, `natural`, `manmade` | `address,poi` |
The layer filter allows selecting places by theme.
@@ -212,7 +218,7 @@ This overrides the specified machine readable format.
## Examples
* [https://nominatim.openstreetmap.org/reverse?format=xml&lat=52.5487429714954&lon=-1.81602098644987&zoom=18&addressdetails=1](https://nominatim.openstreetmap.org/reverse?format=xml&lat=52.5487429714954&lon=-1.81602098644987&zoom=18&addressdetails=1)
* [https://nominatim.openstreetmap.org/reverse?format=xml&lat=52.5487429714954&lon=-1.81602098644987&zoom=18&addressdetails=1&layer=address](https://nominatim.openstreetmap.org/reverse?format=xml&lat=52.5487429714954&lon=-1.81602098644987&zoom=18&addressdetails=1&layer=address)
```xml
<reversegeocode timestamp="Fri, 06 Nov 09 16:33:54 +0000" querystring="...">
@@ -235,7 +241,7 @@ This overrides the specified machine readable format.
##### Example with `format=jsonv2`
* [https://nominatim.openstreetmap.org/reverse?format=jsonv2&lat=-34.44076&lon=-58.70521](https://nominatim.openstreetmap.org/reverse?format=jsonv2&lat=-34.44076&lon=-58.70521)
* [https://nominatim.openstreetmap.org/reverse?format=jsonv2&lat=-34.44076&lon=-58.70521&layer=address](https://nominatim.openstreetmap.org/reverse?format=jsonv2&lat=-34.44076&lon=-58.70521&layer=address)
```json
{
@@ -267,7 +273,7 @@ This overrides the specified machine readable format.
##### Example with `format=geojson`
* [https://nominatim.openstreetmap.org/reverse?format=geojson&lat=44.50155&lon=11.33989](https://nominatim.openstreetmap.org/reverse?format=geojson&lat=44.50155&lon=11.33989)
* [https://nominatim.openstreetmap.org/reverse?format=geojson&lat=44.50155&lon=11.33989&layer=address](https://nominatim.openstreetmap.org/reverse?format=geojson&lat=44.50155&lon=11.33989&layer=address)
```json
{
@@ -319,7 +325,7 @@ This overrides the specified machine readable format.
##### Example with `format=geocodejson`
[https://nominatim.openstreetmap.org/reverse?format=geocodejson&lat=60.2299&lon=11.1663](https://nominatim.openstreetmap.org/reverse?format=geocodejson&lat=60.2299&lon=11.1663)
[https://nominatim.openstreetmap.org/reverse?format=geocodejson&lat=60.2299&lon=11.1663&layer=address](https://nominatim.openstreetmap.org/reverse?format=geocodejson&lat=60.2299&lon=11.1663&layer=address)
```json
{

View File

@@ -136,6 +136,12 @@ that is available in the database, e.g. wikipedia link, opening hours.
When set to 1, include a full list of names for the result. These may include
language variants, older names, references and brand.
| Parameter | Value | Default |
|-----------| ----- | ------- |
| entrances | 0 or 1 | 0 |
When set to 1, include the tagged entrances in the result.
### Language of results
@@ -212,7 +218,7 @@ other layers.
The featureType allows for a more fine-grained selection of places
from the address layer. Results can be restricted to places that make up
the 'state', 'country' or 'city' part of an address. A featureType of
settlement selects any human inhabited feature from 'state' down to
`settlement` selects any human inhabited feature from 'state' down to
'neighbourhood'.
When featureType is set, then results are automatically restricted

View File

@@ -36,18 +36,27 @@ local flex = require('flex-base')
### Using preset configurations
If you want to start with one of the existing presets, then you can import
its settings using the `import_topic()` function:
its settings using the `load_topic()` function:
```
``` lua
local flex = require('flex-base')
flex.import_topic('streets')
flex.load_topic('streets')
```
The `import_topic` function takes an optional second configuration
The `load_topic` function takes an optional second configuration
parameter. The available options are explained in the
[themepark section](#using-osm2pgsql-themepark).
Available topics are: `admin`, `street`, `address`, `full`. These topics
correspond to the [import styles](../admin/Import.md#filtering-imported-data)
you can choose during import. To start with the 'extratags' style, use the
`full` topic with the appropriate config parameter:
``` lua
flex.load_topic('full', {with_extratags = true})
```
!!! note
You can also directly import the preset style files, e.g.
`local flex = require('import-street')`. It is not possible to
@@ -59,15 +68,16 @@ When Nominatim processes an OSM object, it looks for four kinds of tags:
The _main tags_ classify what kind of place the OSM object represents. One
OSM object can have more than one main tag. In such case one database entry
is created for each main tag. _Name tags_ represent searchable names of the
place. _Address tags_ are used to compute the address hierarchy of the place.
place. _Address tags_ are used to compute the address information of the place.
Address tags are used for searching and for creating a display name of the place.
_Extra tags_ are any tags that are not directly related to search but
contain interesting additional information.
contain interesting additional information. These are just saved in the database
and may be returned with the result [on request](../api/Search.md#output-details).
!!! danger
Some tags in the extratags category are used by Nominatim to better
classify the place. You want to make sure these are always present
in custom styles.
classify the place. These tags will always be added, independent of
any settings in the style.
Configuring the style means deciding which key and/or key/value is used
in which category.
@@ -103,6 +113,7 @@ The following classifications are recognized:
| named | Consider as main tag, when the object has a primary name (see [names](#name-tags) below) |
| named_with_key | Consider as main tag, when the object has a primary name with a domain prefix. For example, if the main tag is `bridge=yes`, then it will only be added as an extra entry, if there is a tag `bridge:name[:XXX]` for the same object. If this property is set, all names that are not domain-specific are ignored. |
| fallback | Consider as main tag only when no other main tag was found. Fallback always implies `named`, i.e. fallbacks are only tried for objects with primary names. |
| postcode_area | Tag indicates a postcode area. Copy area into the table of postcodes but only when the object is a relation and has a postcode tagged. |
| delete | Completely ignore the tag in any further processing |
| extra | Move the tag to extratags and then ignore it for further processing |
| `<function>`| Advanced handling, see [below](#advanced-main-tag-handling) |
@@ -116,8 +127,10 @@ value without key, then this is used as default for values that are not listed.
`set_main_tags()` will completely replace the current main tag configuration
with the new configuration. `modify_main_tags()` will merge the new
configuration with the existing one. Otherwise, the two functions do exactly
the same.
configuration with the existing one. Merging is done at value level.
For example, when the current setting is `highway = {'always', primary = 'named'}`,
then `set_main_tags{highway = 'delete'}` will result in a rule
`highway = {'delete', primary = 'named'}`.
!!! example
``` lua
@@ -134,9 +147,9 @@ the same.
when it has a value of `administrative`. Objects with `highway` tags are
always included with two exceptions: the troll tag `highway=no` is
deleted on the spot. And when the value is `street_lamp` then the object
must have a name, too. Finally, if a `landuse` tag is present then
it will be used independently of the concrete value when neither boundary
nor highway tags were found and the object is named.
must also have a name, to be included. Finally, if a `landuse` tag is
present then it will be used independently of the concrete value when
neither boundary nor highway tags were found and the object is named.
##### Presets
@@ -255,11 +268,7 @@ in turn take precedence over prefix matches.
##### Presets
| Name | Description |
| :----- | :---------- |
| required | Tags that Nominatim will use for various computations when present in extratags. Always include these. |
In addition, all [presets from ignored tags](#presets_1) are accepted.
Accepts all [presets from ignored tags](#presets_1).
### General pre-filtering
@@ -326,7 +335,7 @@ defined primary names are forgotten.)
| Name | Description |
| :----- | :---------- |
| core | Basic set of recogniced names for all places. |
| core | Basic set of recognized names for all places. |
| address | Additional names useful when indexing full addresses. |
| poi | Extended set of recognized names for pois. Use on top of the core set. |
@@ -415,6 +424,56 @@ is added for extratags.
already delete the tiger tags with `set_prefilters()` because that
would remove tiger:county before the address tags are processed.
## Filling additional tables
Most of the OSM objects are saved in the main `place` table for further
processing. In addition to that, there are some smaller tables that save
specialised information. The content of these tables can be customized as
well.
### Entrance table
The table `place_entrance` saves information about OSM nodes that represent
an entrance. This data is later mingled with buildings and other areas and
can be returned [on request](../api/Search.md#output-details). The table
saves the type of entrance as well as a set of custom extra tags.
The function `set_entrance_filter()` can be used to customize the table's
content.
When called without any parameter, filling the entrance table is
disabled. When called with a preset name, the appropriate preset is
applied.
To create a custom configuration, call the function
with a table with the following fields:
* __main_tags__ is a list of tags that mark an entrance node. The value of the
first tag found in the list will be used as the entrance type.
* __extra_include__ is an optional list of tags to be added to the extratags
for this entrance. When left out, all tags except for the ones defined
in 'main_tags' will be included. To disable saving of extra tags, set
this to the empty list.
* __extra_exclude__ defines an optional list of tags to drop before including
the remaining tags as extratags. Note that the tags defined in 'main_tags'
will always be excluded, independently of this setting.
To have even more fine-grained control over the output, you can also hand
in a callback for processing entrance information. The callback function
receives a single parameter, the
[osm2pgsql object](https://osm2pgsql.org/doc/manual.html#processing-callbacks).
This object itself must not be modified. The callback should return `nil`
when the object is not an entrance, or a table with a mandatory `entrance`
field containing a string with the type of entrance and an optional
`extratags` field with a simple key-value table of extra information.
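As an illustration, a custom table-based configuration using the fields above might look like the following sketch (this assumes `set_entrance_filter()` is called on the flex module like the other style functions; the tag selection is an example, not a recommended preset):
``` lua
local flex = require('flex-base')

-- Illustrative custom entrance configuration: record nodes tagged with
-- entrance=*, use the value of the entrance tag as the entrance type and
-- keep only a few access-related tags as extratags.
flex.set_entrance_filter{
    main_tags = {'entrance'},
    extra_include = {'access', 'wheelchair', 'level'}
}
```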
##### Presets
| Name | Description |
| :----- | :---------- |
| default | Standard configuration used with `full` and `extratags` styles. |
## Customizing osm2pgsql callbacks
osm2pgsql expects the flex style to implement three callbacks, one process
@@ -556,16 +615,6 @@ the Nominatim topic.
```
Discarding country-level boundaries when running under themepark.
## osm2pgsql gazetteer output
Nominatim still allows you to configure the gazetteer output to remain
backwards compatible with older imports. It will be automatically used
when the style file name ends in `.style`. For documentation of the
old import style, please refer to the documentation of older releases
of Nominatim. Do not use the gazetteer output for new imports. There is no
guarantee that new versions of Nominatim are fully compatible with the
gazetteer output.
## Changing the style of existing databases
There is usually no issue changing the style of a database that is already

View File

@@ -229,7 +229,7 @@ _None._
| Option | Description |
|-----------------|-------------|
| locales | [Locale](../library/Result-Handling.md#locale) object for the requested language(s) |
| locales | [Locales](../library/Result-Handling.md#locale) object for the requested language(s) |
| group_hierarchy | Setting of [group_hierarchy](../api/Details.md#output-details) parameter |
| icon_base_url | (optional) URL pointing to icons as set in [NOMINATIM_MAPICON_URL](Settings.md#nominatim_mapicon_url) |

View File

@@ -602,25 +602,44 @@ results gathered so far.
Note that under high load you may observe that users receive different results
than usual without seeing an error. This may cause some confusion.
### Logging Settings
#### NOMINATIM_LOG_DB
#### NOMINATIM_OUTPUT_NAMES
| Summary | |
| -------------- | --------------------------------------------------- |
| **Description:** | Log requests into the database |
| **Format:** | boolean |
| **Default:** | no |
| **After Changes:** | run `nominatim refresh --website` |
| **Description:** | Specifies order of name tags |
| **Format:** | string: comma-separated list of tag names |
| **Default:** | name:XX,name,brand,official_name:XX,short_name:XX,official_name,short_name,ref |
Enable logging requests into a database table with this setting. The logs
can be found in the table `new_query_log`.
Specifies the order in which different name tags are used.
The values in this list determine the preferred order of name variants,
including language-specific names (in OSM: the name tag with and without any language suffix).
When using this logging method, it is advisable to set up a job that
regularly clears out old logging information. Nominatim will not do that
on its own.
Comma-separated list, where :XX stands for language suffix
(e.g. name:en) and no :XX stands for general tags (e.g. name).
Can be used as the same time as NOMINATIM_LOG_FILE.
See also [NOMINATIM_DEFAULT_LANGUAGE](#nominatim_default_language).
!!! note
If NOMINATIM_OUTPUT_NAMES = `name:XX,name,short_name:XX,short_name` the search follows
```
'name', 'short_name'
```
if we have no preferred language order for showing search results.
For languages ['en', 'es'] the search follows
```
'name:en', 'name:es',
'name',
'short_name:en', 'short_name:es',
'short_name'
```
For those familiar with the internal implementation, the `_place_*` expansion is added, but to simplify, it is not included in this example.
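As a purely illustrative example, a project that wants to prefer official names could set the following in its `.env` (the chosen order is an example, not a recommendation):
```
NOMINATIM_OUTPUT_NAMES=official_name:XX,official_name,name:XX,name,ref
```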
### Logging Settings
#### NOMINATIM_LOG_FILE
@@ -629,23 +648,53 @@ Can be used as the same time as NOMINATIM_LOG_FILE.
| **Description:** | Log requests into a file |
| **Format:** | path |
| **Default:** | _empty_ (logging disabled) |
| **After Changes:** | run `nominatim refresh --website` |
Enable logging of requests into a file with this setting by specifying the
file to log to. A relative file name is assumed to be relative to
the project directory.
the project directory. The format of the log output can be set
with NOMINATIM_LOG_FORMAT.
#### NOMINATIM_LOG_FORMAT
The entries in the log file have the following format:
| Summary | |
| -------------- | --------------------------------------------------- |
| **Description:** | Format of a log line for a single request |
| **Format:** | [Python String Format](https://docs.python.org/3/library/string.html#formatstrings) string |
| **Default:** | `[{start}] {total_time:.4f} {results_total} {endpoint} "{query_string}"` |
<request time> <execution time in s> <number of results> <type> "<query string>"
Describes the content of a log line for a single request. The format
must be readable by Python's format function. Nominatim provides a number
of metrics that can be logged. The default set of metrics is the following:
Request time is the time when the request was started. The execution time is
given in seconds and includes the entire time the query was queued and executed
in the frontend.
type contains the name of the endpoint used.
/// html | div.simple-table
| name | type | Description |
| --------------- | ------ | ------------|
| start | time | Point in time when the request arrived. |
| end | time | Point in time when the request was done. |
| query_start | time | Point in time when processing started. |
| total_time | float | Total time in seconds to handle the request. |
| wait_time | float | Time in seconds the request waited for a database connection to be available. |
| query_time | float | Total time in seconds to process the request once a connection was available. |
| results_total | int | Number of results found. |
| endpoint | string | API endpoint used. |
| query_string | string | Raw query string received. |
///
Variables of type 'time' contain a UTC timestamp string in ISO format.
Nominatim also exposes additional metrics to help with development. These
are subject to change between versions:
/// html | div.simple-table
| name | type | Description |
| ------------------------- | ------ | ------------|
| search_rounds | int | Total number of searches executed for the request. |
| search_min_penalty | float | Minimal possible penalty for the request. |
| search_first_result_round | int | Number of first search to yield any result. |
| search_min_result_penalty | float | Minimal penalty by a result found. |
| search_best_penalty_round | int | Search round that yielded the best penalty result. |
///
Can be used as the same time as NOMINATIM_LOG_DB.
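For illustration only, a project `.env` could combine a log file with a custom format assembled from the metrics listed above (the concrete values here are just an example, not a recommended setting):
```
NOMINATIM_LOG_FILE=nominatim-queries.log
NOMINATIM_LOG_FORMAT=[{start}] total={total_time:.4f}s wait={wait_time:.4f}s results={results_total} {endpoint} "{query_string}"
```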
#### NOMINATIM_DEBUG_SQL

View File

@@ -50,7 +50,7 @@ queries. This happens in two stages:
as during the import process but may involve other processing like,
for example, word break detection.
2. The **token analysis** step breaks down the query parts into tokens,
looks them up in the database and assignes them possible functions and
looks them up in the database and assigns them possible functions and
probabilities.
Query processing can be further customized while the rest of the analysis
@@ -67,7 +67,13 @@ Here is an example configuration file:
``` yaml
query-preprocessing:
- normalize
- step: split_japanese_phrases
- step: regex_replace
replacements:
- pattern: https?://[^\s]* # Filter URLs starting with http or https
replace: ''
- step: normalize
normalization:
- ":: lower ()"
- "ß > 'ss'" # German szet is unambiguously equal to double ss
@@ -88,8 +94,8 @@ token-analysis:
replacements: ['ä', 'ae']
```
The configuration file contains four sections:
`normalization`, `transliteration`, `sanitizers` and `token-analysis`.
The configuration file contains five sections:
`query-preprocessing`, `normalization`, `transliteration`, `sanitizers` and `token-analysis`.
#### Query preprocessing
@@ -106,6 +112,19 @@ The following is a list of preprocessors that are shipped with Nominatim.
heading_level: 6
docstring_section_style: spacy
##### regex-replace
::: nominatim_api.query_preprocessing.regex_replace
options:
members: False
heading_level: 6
docstring_section_style: spacy
description:
This option runs any given regex pattern on the input and replaces values accordingly
replacements:
- pattern: regex pattern
replace: string to replace with
#### Normalization and Transliteration

View File

@@ -3,8 +3,7 @@
### Import tables
OSM data is initially imported using [osm2pgsql](https://osm2pgsql.org).
Nominatim uses its own data output style 'gazetteer', which differs from the
output style created for map rendering.
Nominatim uses a custom flex style to create the initial import tables.
The import process creates the following tables:
@@ -14,7 +13,7 @@ The `planet_osm_*` tables are the usual backing tables for OSM data. Note
that Nominatim uses them to look up special relations and to find nodes on
ways.
The gazetteer style produces a single table `place` as output with the following
The osm2pgsql import produces a single table `place` as output with the following
columns:
* `osm_type` - kind of OSM object (**N** - node, **W** - way, **R** - relation)
@@ -80,7 +79,7 @@ the placex table. Only three columns are special:
Address interpolations are always ways in OSM, which is why there is no column
`osm_type`.
The **location_postcode** table holds computed centroids of all postcodes that
The **location_postcodes** table holds computed centroids of all postcodes that
can be found in the OSM data. The meaning of the columns is again the same
as that of the placex table.

View File

@@ -25,15 +25,15 @@ following packages should get you started:
## Prerequisites for testing and documentation
The Nominatim test suite consists of behavioural tests (using behave) and
The Nominatim test suite consists of behavioural tests (using pytest-bdd) and
unit tests (using pytest). It has the following additional requirements:
* [behave test framework](https://behave.readthedocs.io) >= 1.2.6
* [flake8](https://flake8.pycqa.org/en/stable/) (CI always runs the latest version from pip)
* [mypy](http://mypy-lang.org/) (plus typing information for external libs)
* [Python Typing Extensions](https://github.com/python/typing_extensions) (for Python < 3.9)
* [pytest](https://pytest.org)
* [pytest-asyncio](https://pytest-asyncio.readthedocs.io)
* [pytest-bdd](https://pytest-bdd.readthedocs.io)
For testing the Python search frontend, you need to install extra dependencies
depending on your choice of webserver framework:
@@ -48,9 +48,6 @@ The documentation is built with mkdocs:
* [mkdocs-material](https://squidfunk.github.io/mkdocs-material/)
* [mkdocs-gen-files](https://oprypin.github.io/mkdocs-gen-files/)
Please be aware that tests always run against the globally installed
osm2pgsql, so you need to have this set up. If you want to test against
the vendored version of osm2pgsql, you need to set the PATH accordingly.
### Installing prerequisites on Ubuntu/Debian
@@ -69,13 +66,14 @@ To set up the virtual environment with all necessary packages run:
```sh
virtualenv ~/nominatim-dev-venv
~/nominatim-dev-venv/bin/pip install\
psutil psycopg[binary] PyICU SQLAlchemy \
python-dotenv jinja2 pyYAML datrie behave \
mkdocs mkdocstrings mkdocs-gen-files pytest pytest-asyncio flake8 \
psutil 'psycopg[binary]' PyICU SQLAlchemy \
python-dotenv jinja2 pyYAML \
mkdocs 'mkdocstrings[python]' mkdocs-gen-files \
pytest pytest-asyncio pytest-bdd flake8 \
types-jinja2 types-markupsafe types-psutil types-psycopg2 \
types-pygments types-pyyaml types-requests types-ujson \
types-urllib3 typing-extensions unicorn falcon starlette \
uvicorn mypy osmium aiosqlite
uvicorn mypy osmium aiosqlite mwparserfromhell
```
Now enter the virtual environment whenever you want to develop:
@@ -94,7 +92,7 @@ but executes against the code in the source tree. For example:
```
me@machine:~$ cd Nominatim
me@machine:~Nominatim$ ./nominatim-cli.py --version
Nominatim version 4.4.99-1
Nominatim version 5.1.0-0
```
Make sure you have activated the virtual environment holding all

View File

@@ -60,13 +60,19 @@ The order of phrases matters to Nominatim when doing further processing.
Thus, while you may split or join phrases, you should not reorder them
unless you really know what you are doing.
Phrase types (`nominatim_api.search.PhraseType`) can further help narrowing
down how the tokens in the phrase are interpreted. The following phrase types
are known:
Phrase types can further help narrowing down how the tokens in the phrase
are interpreted. The following phrase types are known:
::: nominatim_api.search.PhraseType
options:
heading_level: 6
| Name | Description |
|----------------|-------------|
| PHRASE_ANY | No specific designation (i.e. source is free-form query) |
| PHRASE_AMENITY | Contains name or type of a POI |
| PHRASE_STREET | Contains a street name optionally with a housenumber |
| PHRASE_CITY | Contains the postal city |
| PHRASE_COUNTY | Contains the equivalent of a county |
| PHRASE_STATE | Contains a state or province |
| PHRASE_POSTCODE| Contains a postal code |
| PHRASE_COUNTRY | Contains the country name or code |
## Custom sanitizer modules

View File

@@ -43,53 +43,62 @@ The name of the pytest binary depends on your installation.
## BDD Functional Tests (`test/bdd`)
Functional tests are written as BDD instructions. For more information on
the philosophy of BDD testing, see the
[Behave manual](http://pythonhosted.org/behave/philosophy.html).
The following explanation assume that the reader is familiar with the BDD
notations of features, scenarios and steps.
All possible steps can be found in the `steps` directory and should ideally
be documented.
the philosophy of BDD testing, read the Wikipedia article on
[Behaviour-driven development](https://en.wikipedia.org/wiki/Behavior-driven_development).
### General Usage
To run the functional tests, do
cd test/bdd
behave
pytest test/bdd
The tests can be configured with a set of environment variables (`behave -D key=val`):
You can run a single feature file using expression matching:
* `TEMPLATE_DB` - name of template database used as a skeleton for
the test databases (db tests)
* `TEST_DB` - name of test database (db tests)
* `API_TEST_DB` - name of the database containing the API test data (api tests)
* `API_TEST_FILE` - OSM file to be imported into the API test database (api tests)
* `API_ENGINE` - webframe to use for running search queries, same values as
`nominatim serve --engine` parameter
* `DB_HOST` - (optional) hostname of database host
* `DB_PORT` - (optional) port of database on host
* `DB_USER` - (optional) username of database login
* `DB_PASS` - (optional) password for database login
* `REMOVE_TEMPLATE` - if true, the template and API database will not be reused
during the next run. Reusing the base templates speeds
up tests considerably but might lead to outdated errors
for some changes in the database layout.
* `KEEP_TEST_DB` - if true, the test database will not be dropped after a test
is finished. Should only be used if one single scenario is
run, otherwise the result is undefined.
pytest test/bdd -k osm2pgsql/import/entrances.feature
This even works for running a single scenario by adding the line number of the
scenario header, like this:
pytest test/bdd -k 'osm2pgsql/import/entrances.feature and L4'
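The same selection can also be driven from Python through pytest's programmatic
entry point. This is merely a convenience illustration, not part of the Nominatim
tooling:
``` python
import pytest

# Equivalent to the command line above: run a single scenario of one feature file.
pytest.main(['test/bdd', '-k', 'osm2pgsql/import/entrances.feature and L4'])
```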
The BDD tests create their own databases. You can set the names of these databases
through configuration variables in your `pytest.ini`:
* `nominatim_test_db` defines the name of the temporary database created for
a single test (default: `test_nominatim`)
* `nominatim_api_test_db` defines the name of the database containing
the API test data, see also below (default: `test_api_nominatim`)
* `nominatim_template_db` defines the name of the template database used
for creating the temporary test databases. It contains some static setup
which usually doesn't change between imports of OSM data
(default: `test_template_nominatim`)
To change other connection parameters for the PostgreSQL database, use
the [libpq environment variables](https://www.postgresql.org/docs/current/libpq-envars.html).
Never set a password through these variables. Use a
[password file](https://www.postgresql.org/docs/current/libpq-pgpass.html) instead.
The API test database and the template database are only created once and then
left untouched. This is usually what you want because it speeds up subsequent
runs of BDD tests. If you do change code that has an influence on the content
of these databases, you can run pytest with the `--nominatim-purge` parameter
and the databases will be dropped and recreated from scratch.
When running the BDD tests with make (using `make tests` or `make bdd`),
the databases will always be purged.
The temporary test database is usually dropped directly after the test, so
it does not take up unnecessary space. If you want to keep the database around,
for example while debugging a specific BDD test, use the parameter
`--nominatim-keep-db`.
Logging can be defined through command line parameters of behave itself. Check
out `behave --help` for details. Also have a look at the 'work-in-progress'
feature of behave which comes in handy when writing new tests.
### API Tests (`test/bdd/api`)
These tests are meant to test the different API endpoints and their parameters.
They require importing several datasets into a test database. This is normally
done automatically during setup of the test. The API test database is then
kept around and reused in subsequent runs of behave. Use `behave -DREMOVE_TEMPLATE`
kept around and reused in subsequent runs of behave. Use `--nominatim-purge`
to force a reimport of the database.
The official test dataset is saved in the file `test/testdb/apidb-test-data.pbf`
@@ -109,12 +118,12 @@ test the correctness of osm2pgsql. Each test will write some data into the `plac
table (and optionally the `planet_osm_*` tables if required) and then run
Nominatim's processing functions on that.
These tests need to create their own test databases. By default they will be
called `test_template_nominatim` and `test_nominatim`. Names can be changed with
the environment variables `TEMPLATE_DB` and `TEST_DB`. The user running the tests
needs superuser rights for postgres.
These tests use the template database and create temporary test databases for
each test.
### Import Tests (`test/bdd/osm2pgsql`)
These tests check that data is imported correctly into the place table. They
use the same template database as the DB Creation tests, so the same remarks apply.
These tests check that data is imported correctly into the place table.
These tests also use the template database and create temporary test databases
for each test.

View File

@@ -9,7 +9,7 @@ the address computation and the search frontend.
The __data import__ stage reads the raw OSM data and extracts all information
that is useful for geocoding. This part is done by osm2pgsql, the same tool
that can also be used to import a rendering database. It uses the special
gazetteer output plugin in `osm2pgsql/src/output-gazetter.[ch]pp`. The result of
flex output style defined in the directory `/lib-lua`. The result of
the import can be found in the database table `place`.
The __address computation__ or __indexing__ stage takes the data from `place`

View File

@@ -74,15 +74,16 @@ map place_addressline {
isaddress => BOOLEAN
}
map location_postcode {
map location_postcodes {
place_id => BIGINT
osm_id => BIGINT
postcode => TEXT
parent_place_id => BIGINT
rank_search => SMALLINT
rank_address => SMALLINT
indexed_status => SMALLINT
indexed_date => TIMESTAMP
geometry => GEOMETRY
centroid -> GEOMETRY
}
placex::place_id <-- search_name::place_id
@@ -94,6 +95,6 @@ search_name::nameaddress_vector --> word::word_id
place_addressline -[hidden]> location_property_osmline
search_name -[hidden]> place_addressline
location_property_osmline -[hidden]-> location_postcode
location_property_osmline -[hidden]-> location_postcodes
@enduml

View File

@@ -39,3 +39,9 @@ th {
filter: grayscale(100%);
font-size: 80%;
}
.simple-table table:not([class]) th,
.simple-table table:not([class]) td {
padding: 2px 4px;
background: white;
}

View File

@@ -248,19 +248,19 @@ of the result. To do that, you first need to decide in which language the
results should be presented. As with the names in the result itself, the
places in `address_rows` contain all possible name translations for each row.
The library has a helper class `Locale` which helps extracting a name of a
The library has a helper class `Locales` which helps extract the name of a
place in the preferred language. It takes a single parameter with a list
of language codes in the order of preference. So
``` python
locale = napi.Locale(['fr', 'en'])
locale = napi.Locales(['fr', 'en'])
```
creates a helper class that returns the name preferably in French. If that is
not possible, it tries English and eventually falls back to the default `name`
or `ref`.
The `Locale` object can be applied to a name dictionary to return the best-matching
The `Locales` object can be applied to a name dictionary to return the best-matching
name out of it:
``` python
@@ -268,13 +268,17 @@ name out of it:
'Brugges'
```
The `address_row` field has a helper function to apply the function to all
its members and save the result in the `local_name` field. It also returns
all the localized names as a convenient simple list. This list can be used
to create a human-readable output:
Each entry in `address_rows` carries a `local_name` field that holds the display name
of that address line. The overall `result` object builds on this: it has a helper that
applies the localization to all of its address rows and saves the combined result in
its `locale_name` field.
To fill the `local_name` fields in your preferred language, use the `Locales` object:
its `localize_results` function explicitly sets `local_name` on every address row.
``` python
>>> address_parts = results[0].address_rows.localize(locale)
>>> Locales().localize_results(results)
>>> address_parts = results[0].address_rows
>>> print(', '.join(p.local_name for p in address_parts if p.local_name))
Bruges, Flandre-Occidentale, Flandre, Belgique
```

View File

@@ -49,7 +49,11 @@ its address.
## Localization
Results are always returned with the full list of available names.
Results are always returned with the full list of available names. However, the
default `locale_name` must be set explicitly using the `localize` function of
`Locales`. This searches the full list of available names for the one most
preferred by the user. Once it is set, the `display_name` field of a `Result`
object can be used to retrieve the localized name.
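Putting this together, a minimal sketch of the flow might look as follows. It assumes
the synchronous `NominatimAPI` handle from the library introduction; the project
directory setup is omitted here.
``` python
import nominatim_api as napi

api = napi.NominatimAPI()  # project directory setup omitted in this sketch
try:
    results = api.search('Brugge')
    # Prefer French names, fall back to English, then the default name/ref.
    napi.Locales(['fr', 'en']).localize_results(results)
    print(results[0].display_name)
finally:
    api.close()
```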
### Locale

View File

@@ -29,7 +29,9 @@ local NAME_FILTER = nil
local ADDRESS_TAGS = {}
local ADDRESS_FILTER = nil
local EXTRATAGS_FILTER
local REQUIRED_EXTRATAGS_FILTER
local POSTCODE_FALLBACK = true
local ENTRANCE_FUNCTION = nil
-- This file can also be directly require'd instead of running it under
-- the themepark framework. In that case the first parameter is usually
@@ -40,37 +42,63 @@ if type(themepark) ~= 'table' then
themepark = nil
end
-- The single place table.
local place_table_definition = {
name = "place",
ids = { type = 'any', id_column = 'osm_id', type_column = 'osm_type' },
columns = {
{ column = 'class', type = 'text', not_null = true },
{ column = 'type', type = 'text', not_null = true },
{ column = 'admin_level', type = 'smallint' },
{ column = 'name', type = 'hstore' },
{ column = 'address', type = 'hstore' },
{ column = 'extratags', type = 'hstore' },
{ column = 'geometry', type = 'geometry', projection = 'WGS84', not_null = true },
-- The place tables carry the raw OSM information.
local table_definitions = {
place = {
ids = { type = 'any', id_column = 'osm_id', type_column = 'osm_type' },
columns = {
{ column = 'class', type = 'text', not_null = true },
{ column = 'type', type = 'text', not_null = true },
{ column = 'admin_level', type = 'smallint' },
{ column = 'name', type = 'hstore' },
{ column = 'address', type = 'hstore' },
{ column = 'extratags', type = 'hstore' },
{ column = 'geometry', type = 'geometry', projection = 'WGS84', not_null = true },
},
indexes = {}
},
data_tablespace = os.getenv("NOMINATIM_TABLESPACE_PLACE_DATA"),
index_tablespace = os.getenv("NOMINATIM_TABLESPACE_PLACE_INDEX"),
indexes = {}
place_entrance = {
ids = { type = 'node', id_column = 'osm_id' },
columns = {
{ column = 'type', type = 'text', not_null = true },
{ column = 'extratags', type = 'hstore' },
{ column = 'geometry', type = 'geometry', projection = 'WGS84', not_null = true }
},
indexes = {}
},
place_postcode = {
ids = { type = 'any', id_column = 'osm_id', type_column = 'osm_type' },
columns = {
{ column = 'postcode', type = 'text', not_null = true },
{ column = 'country_code', type = 'text' },
{ column = 'centroid', type = 'point', projection = 'WGS84', not_null = true },
{ column = 'geometry', type = 'geometry', projection = 'WGS84' }
},
indexes = {
{ column = 'postcode', method = 'btree' }
}
}
}
local insert_row
local insert_row = {}
local script_path = debug.getinfo(1, "S").source:match("@?(.*/)")
local PRESETS = loadfile(script_path .. 'presets.lua')()
if themepark then
themepark:add_table(place_table_definition)
insert_row = function(columns)
themepark:insert('place', columns, {}, {})
end
else
local place_table = osm2pgsql.define_table(place_table_definition)
insert_row = function(columns)
place_table:insert(columns)
for table_name, table_definition in pairs(table_definitions) do
table_definition.name = table_name
table_definition.data_tablespace = os.getenv("NOMINATIM_TABLESPACE_PLACE_DATA")
table_definition.index_tablespace = os.getenv("NOMINATIM_TABLESPACE_PLACE_INDEX")
if themepark then
themepark:add_table(table_definition)
insert_row[table_name] = function(columns)
themepark:insert(table_name, columns, {}, {})
end
else
local place_table = osm2pgsql.define_table(table_definition)
insert_row[table_name] = function(columns)
place_table:insert(columns)
end
end
end
@@ -97,6 +125,7 @@ local PlaceTransform = {}
-- Special transform meanings which are interpreted elsewhere
PlaceTransform.fallback = 'fallback'
PlaceTransform.postcode_area = 'postcode_area'
PlaceTransform.delete = 'delete'
PlaceTransform.extra = 'extra'
@@ -150,24 +179,6 @@ local function address_fallback(place)
return place:clone{names=names}
end
--------- Built-in extratags transformation functions ---------------
local function default_extratags_filter(p, k)
-- Default handling is to copy over place tag for boundaries.
-- Nominatim needs this.
if k ~= 'boundary' or p.intags.place == nil then
return p.extratags
end
local extra = { place = p.intags.place }
for kin, vin in pairs(p.extratags) do
extra[kin] = vin
end
return extra
end
EXTRATAGS_FILTER = default_extratags_filter
----------------- other helper functions -----------------------------
local function lookup_prefilter_classification(k, v)
@@ -421,26 +432,47 @@ function Place:write_place(k, v, mfunc)
return 0
end
function Place:write_row(k, v)
function Place:geometry_is_valid()
if self.geometry == nil then
self.geometry = self.geom_func(self.object)
if self.geometry == nil or self.geometry:is_null() then
self.geometry = false
return false
end
return true
end
if self.geometry:is_null() then
return self.geometry ~= false
end
function Place:write_row(k, v)
if not self:geometry_is_valid() then
return 0
end
local extratags = EXTRATAGS_FILTER(self, k, v)
if not (extratags and next(extratags)) then
extratags = nil
end
local extra = EXTRATAGS_FILTER(self, k, v) or {}
insert_row{
for tk, tv in pairs(self.object.tags) do
if REQUIRED_EXTRATAGS_FILTER(tk, tv) and extra[tk] == nil then
extra[tk] = tv
end
end
if extra and next(extra) == nil then
extra = nil
end
insert_row.place{
class = k,
type = v,
admin_level = self.admin_level,
name = next(self.names) and self.names,
address = next(self.address) and self.address,
extratags = extratags,
extratags = extra,
geometry = self.geometry
}
@@ -593,6 +625,16 @@ end
-- Process functions for all data types
function module.process_node(object)
if ENTRANCE_FUNCTION ~= nil then
local entrance_info = ENTRANCE_FUNCTION(object)
if entrance_info ~= nil then
insert_row.place_entrance{
type = entrance_info.entrance,
extratags = entrance_info.extratags,
geometry = object:as_point()
}
end
end
local function geom_func(o)
return o:as_point()
@@ -608,6 +650,9 @@ function module.process_way(object)
if geom:is_null() then
geom = o:as_linestring()
if geom:is_null() or geom:length() > 30 then
return nil
end
end
return geom
@@ -657,9 +702,6 @@ function module.process_tags(o)
if o.address.country ~= nil and #o.address.country ~= 2 then
o.address['country'] = nil
end
if POSTCODE_FALLBACK and fallback == nil and o.address.postcode ~= nil then
fallback = {'place', 'postcode', PlaceTransform.always}
end
if o.address.interpolation ~= nil then
o:write_place('place', 'houses', PlaceTransform.always)
@@ -667,23 +709,53 @@ function module.process_tags(o)
end
-- collect main keys
local postcode_collect = false
for k, v in pairs(o.intags) do
local ktable = MAIN_KEYS[k]
if ktable then
local ktype = ktable[v] or ktable[1]
if type(ktype) == 'function' then
o:write_place(k, v, ktype)
elseif ktype == 'postcode_area' then
postcode_collect = true
if o.object.type == 'relation'
and o.address.postcode ~= nil
and o:geometry_is_valid() then
insert_row.place_postcode{
postcode = o.address.postcode,
centroid = o.geometry:centroid(),
geometry = o.geometry
}
end
elseif ktype == 'fallback' and o.has_name then
fallback = {k, v, PlaceTransform.named}
end
end
end
if fallback ~= nil and o.num_entries == 0 then
o:write_place(fallback[1], fallback[2], fallback[3])
if o.num_entries == 0 then
if fallback ~= nil then
o:write_place(fallback[1], fallback[2], fallback[3])
elseif POSTCODE_FALLBACK and not postcode_collect
and o.address.postcode ~= nil
and o:geometry_is_valid() then
insert_row.place_postcode{
postcode = o.address.postcode,
centroid = o.geometry:centroid()
}
end
end
end
--------- Extratags post-processing functions ---------------
local function default_extratags_filter(p, k)
return p.extratags
end
EXTRATAGS_FILTER = default_extratags_filter
REQUIRED_EXTRATAGS_FILTER = module.tag_match(PRESETS.EXTRATAGS)
--------- Convenience functions for simple style configuration -----------------
function module.set_prefilters(data)
@@ -714,7 +786,7 @@ end
function module.add_for_extratags(data)
if type(data) == 'string' then
local preset = data
data = PRESETS.EXTRATAGS[data] or PRESETS.IGNORE_KEYS[data]
data = PRESETS.IGNORE_KEYS[data]
if data == nil then
error('Unknown preset for extratags: ' .. preset)
end
@@ -914,6 +986,99 @@ function module.set_relation_types(data)
end
end
function module.set_entrance_filter(data)
if data == nil or type(data) == 'function' then
ENTRANCE_FUNCTION = data
return nil
end
if type(data) == 'string' then
local preset = data
data = PRESETS.ENTRANCE_TABLE[data]
if data == nil then
error('Unknown preset for entrance table: ' .. preset)
end
end
ENTRANCE_FUNCTION = nil
if data.main_tags ~= nil and next(data.main_tags) ~= nil then
if data.extra_include ~= nil and next(data.extra_include) == nil then
-- shortcut: no extra tags requested
ENTRANCE_FUNCTION = function(o)
for _, v in ipairs(data.main_tags) do
if o.tags[v] ~= nil then
return {entrance = o.tags[v]}
end
end
return nil
end
else
if data.extra_include ~= nil then
local tags = {}
for _, v in pairs(data.extra_include) do
tags[v] = true
end
if data.extra_exclude ~= nil then
for _, v in pairs(data.extra_exclude) do
tags[v] = nil
end
end
for _, v in pairs(data.main_tags) do
tags[v] = nil
end
ENTRANCE_FUNCTION = function(o)
for _, v in ipairs(data.main_tags) do
if o.tags[v] ~= nil then
local entrance = o.tags[v]
local extra = {}
for k, v in pairs(tags) do
extra[k] = o.tags[k]
end
if next(extra) == nil then
extra = nil
end
return {entrance = entrance, extratags = extra}
end
end
return nil
end
else
local notags = {}
if data.extra_exclude ~= nil then
for _, v in pairs(data.extra_exclude) do
notags[v] = 1
end
end
for _, v in pairs(data.main_tags) do
notags[v] = 1
end
ENTRANCE_FUNCTION = function(o)
for _, v in ipairs(data.main_tags) do
if o.tags[v] ~= nil then
local entrance = o.tags[v]
local extra = {}
for k, v in pairs(o.tags) do
if notags[k] ~= 1 then
extra[k] = v
end
end
if next(extra) == nil then
extra = nil
end
return {entrance = entrance, extratags = extra}
end
end
return nil
end
end
end
end
end
function module.get_taginfo()
return {main = MAIN_KEYS, name = NAMES, address = ADDRESS_TAGS}
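To clarify the branching in `set_entrance_filter` above, here is a condensed Python
sketch of the same selection logic. The names are invented for illustration, and the
wildcard key patterns used by the presets are not handled here.
``` python
from typing import Dict, Iterable, Mapping, Optional

def make_entrance_filter(main_tags: Iterable[str],
                         extra_include: Optional[Iterable[str]] = None,
                         extra_exclude: Optional[Iterable[str]] = None):
    """Build a function that extracts entrance info from an OSM tag dict."""
    exclude = set(extra_exclude or ()) | set(main_tags)

    def entrance_info(tags: Mapping[str, str]) -> Optional[Dict[str, object]]:
        for key in main_tags:
            if key not in tags:
                continue
            if extra_include is not None:
                # Only explicitly requested keys are copied (minus exclusions).
                wanted = set(extra_include) - exclude
                extra = {k: tags[k] for k in wanted if k in tags}
            else:
                # Copy every tag that is not excluded.
                extra = {k: v for k, v in tags.items() if k not in exclude}
            return {'entrance': tags[key], 'extratags': extra or None}
        return None

    return entrance_info
```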

View File

@@ -117,7 +117,8 @@ module.MAIN_TAGS.all_boundaries = {
boundary = {'named',
place = 'delete',
land_area = 'delete',
postal_code = 'always'},
protected_area = 'fallback',
postal_code = 'postcode_area'},
landuse = 'fallback',
place = 'always'
}
@@ -187,7 +188,7 @@ module.MAIN_TAGS_POIS = function (group)
passing_place = group,
street_lamp = 'named',
traffic_signals = 'named'},
historic = {'always',
historic = {'fallback',
yes = group,
no = group},
information = {include_when_tag_present('tourism', 'information'),
@@ -196,9 +197,12 @@ module.MAIN_TAGS_POIS = function (group)
trail_blaze = 'never'},
junction = {'fallback',
no = group},
landuse = {cemetery = 'always'},
leisure = {'always',
nature_reserve = 'fallback',
nature_reserve = 'named',
swimming_pool = 'named',
garden = 'named',
common = 'named',
no = group},
lock = {yes = lock_transform},
man_made = {pier = 'always',
@@ -229,6 +233,7 @@ module.MAIN_TAGS_POIS = function (group)
shop = {'always',
no = group},
tourism = {'always',
attraction = 'fallback',
no = group,
yes = group,
information = exclude_when_key_present('information')},
@@ -317,7 +322,6 @@ module.NAME_TAGS = {}
module.NAME_TAGS.core = {main = {'name', 'name:*',
'int_name', 'int_name:*',
'nat_name', 'nat_name:*',
'reg_name', 'reg_name:*',
'loc_name', 'loc_name:*',
'old_name', 'old_name:*',
@@ -330,7 +334,7 @@ module.NAME_TAGS.core = {main = {'name', 'name:*',
}
module.NAME_TAGS.address = {house = {'addr:housename'}}
module.NAME_TAGS.poi = group_merge({main = {'brand'},
extra = {'iata', 'icao'}},
extra = {'iata', 'icao', 'faa'}},
module.NAME_TAGS.core)
-- Address tagging
@@ -360,7 +364,7 @@ module.IGNORE_KEYS.metatags = {'note', 'note:*', 'source', 'source:*', '*source'
'tiger:cfcc', 'tiger:reviewed', 'nysgissam:*',
'NHD:*', 'nhd:*', 'gnis:*', 'geobase:*', 'yh:*',
'osak:*', 'naptan:*', 'CLC:*', 'import', 'it:fvg:*',
'lacounty:*', 'ref:linz:*',
'lacounty:*', 'ref:linz:*', 'survey:*',
'ref:bygningsnr', 'ref:ruian:*', 'building:ruian:type',
'type',
'is_in:postcode'}
@@ -371,10 +375,15 @@ module.IGNORE_KEYS.address = {'addr:street:*', 'addr:city:*', 'addr:district:*',
'addr:province:*', 'addr:subdistrict:*', 'addr:place:*',
'addr:TW:dataset'}
-- Extra tags (prefiltered away)
-- INTERNAL: Required extra tags
module.EXTRATAGS = {}
module.EXTRATAGS = {keys = {'wikipedia', 'wikipedia:*', 'wikidata', 'capital'}}
module.EXTRATAGS.required = {'wikipedia', 'wikipedia:*', 'wikidata', 'capital'}
-- Defaults for the entrance table
module.ENTRANCE_TABLE = {}
module.ENTRANCE_TABLE.default = {main_tags = {'entrance', 'routing:entrance'},
extra_exclude = module.IGNORE_KEYS.metatags}
return module

View File

@@ -11,7 +11,6 @@ flex.set_address_tags('core')
flex.modify_address_tags('houses')
flex.ignore_keys('metatags')
flex.add_for_extratags('required')
if cfg.with_extratags then
flex.set_unused_handling{delete_keys = {'tiger:*'}}

View File

@@ -8,7 +8,6 @@ flex.set_address_tags('core')
flex.set_postcode_fallback(false)
flex.ignore_keys('metatags')
flex.add_for_extratags('required')
if cfg.with_extratags then
flex.set_unused_handling{delete_keys = {'tiger:*'}}

View File

@@ -20,7 +20,6 @@ flex.set_address_tags('core')
flex.modify_address_tags('houses')
flex.ignore_keys('metatags')
flex.add_for_extratags('required')
if cfg.with_extratags then
flex.set_unused_handling{delete_keys = {'tiger:*'}}
@@ -30,3 +29,5 @@ else
flex.ignore_keys('name')
flex.ignore_keys('address')
end
flex.set_entrance_filter('default')

View File

@@ -10,7 +10,6 @@ flex.set_address_tags('core')
flex.set_postcode_fallback(false)
flex.ignore_keys('metatags')
flex.add_for_extratags('required')
if cfg.with_extratags then
flex.set_unused_handling{delete_keys = {'tiger:*'}}

View File

@@ -2,13 +2,12 @@
--
-- This file is part of Nominatim. (https://nominatim.org)
--
-- Copyright (C) 2022 by the Nominatim developer community.
-- Copyright (C) 2025 by the Nominatim developer community.
-- For a full list of authors see the git log.
{% include('functions/utils.sql') %}
{% include('functions/ranking.sql') %}
{% include('functions/importance.sql') %}
{% include('functions/address_lookup.sql') %}
{% include('functions/interpolation.sql') %}
{% if 'place' in db.tables %}
@@ -19,7 +18,7 @@
{% include 'functions/placex_triggers.sql' %}
{% endif %}
{% if 'location_postcode' in db.tables %}
{% if 'location_postcodes' in db.tables %}
{% include 'functions/postcode_triggers.sql' %}
{% endif %}

View File

@@ -1,334 +0,0 @@
-- SPDX-License-Identifier: GPL-2.0-only
--
-- This file is part of Nominatim. (https://nominatim.org)
--
-- Copyright (C) 2022 by the Nominatim developer community.
-- For a full list of authors see the git log.
-- Functions for returning address information for a place.
DROP TYPE IF EXISTS addressline CASCADE;
CREATE TYPE addressline as (
place_id BIGINT,
osm_type CHAR(1),
osm_id BIGINT,
name HSTORE,
class TEXT,
type TEXT,
place_type TEXT,
admin_level INTEGER,
fromarea BOOLEAN,
isaddress BOOLEAN,
rank_address INTEGER,
distance FLOAT
);
CREATE OR REPLACE FUNCTION get_name_by_language(name hstore, languagepref TEXT[])
RETURNS TEXT
AS $$
DECLARE
result TEXT;
BEGIN
IF name is null THEN
RETURN null;
END IF;
FOR j IN 1..array_upper(languagepref,1) LOOP
IF name ? languagepref[j] THEN
result := trim(name->languagepref[j]);
IF result != '' THEN
return result;
END IF;
END IF;
END LOOP;
-- as a fallback - take the last element since it is the default name
RETURN trim((avals(name))[array_length(avals(name), 1)]);
END;
$$
LANGUAGE plpgsql IMMUTABLE;
--housenumber only needed for tiger data
CREATE OR REPLACE FUNCTION get_address_by_language(for_place_id BIGINT,
housenumber INTEGER,
languagepref TEXT[])
RETURNS TEXT
AS $$
DECLARE
result TEXT[];
currresult TEXT;
prevresult TEXT;
location RECORD;
BEGIN
result := '{}';
prevresult := '';
FOR location IN
SELECT name,
CASE WHEN place_id = for_place_id THEN 99 ELSE rank_address END as rank_address
FROM get_addressdata(for_place_id, housenumber)
WHERE isaddress order by rank_address desc
LOOP
currresult := trim(get_name_by_language(location.name, languagepref));
IF currresult != prevresult AND currresult IS NOT NULL
AND result[(100 - location.rank_address)] IS NULL
THEN
result[(100 - location.rank_address)] := currresult;
prevresult := currresult;
END IF;
END LOOP;
RETURN array_to_string(result,', ');
END;
$$
LANGUAGE plpgsql STABLE;
DROP TYPE IF EXISTS addressdata_place;
CREATE TYPE addressdata_place AS (
place_id BIGINT,
country_code VARCHAR(2),
housenumber TEXT,
postcode TEXT,
class TEXT,
type TEXT,
name HSTORE,
address HSTORE,
centroid GEOMETRY
);
-- Compute the list of address parts for the given place.
--
-- If in_housenumber is greater than or equal to 0, look for an interpolation.
CREATE OR REPLACE FUNCTION get_addressdata(in_place_id BIGINT, in_housenumber INTEGER)
RETURNS setof addressline
AS $$
DECLARE
place addressdata_place;
location RECORD;
country RECORD;
current_rank_address INTEGER;
location_isaddress BOOLEAN;
BEGIN
-- The place in question might not have a direct entry in place_addressline.
-- Look for the parent of such places then and save it in place.
-- first query osmline (interpolation lines)
IF in_housenumber >= 0 THEN
SELECT parent_place_id as place_id, country_code,
in_housenumber as housenumber, postcode,
'place' as class, 'house' as type,
null as name, null as address,
ST_Centroid(linegeo) as centroid
INTO place
FROM location_property_osmline
WHERE place_id = in_place_id
AND in_housenumber between startnumber and endnumber;
END IF;
--then query tiger data
{% if config.get_bool('USE_US_TIGER_DATA') %}
IF place IS NULL AND in_housenumber >= 0 THEN
SELECT parent_place_id as place_id, 'us' as country_code,
in_housenumber as housenumber, postcode,
'place' as class, 'house' as type,
null as name, null as address,
ST_Centroid(linegeo) as centroid
INTO place
FROM location_property_tiger
WHERE place_id = in_place_id
AND in_housenumber between startnumber and endnumber;
END IF;
{% endif %}
-- postcode table
IF place IS NULL THEN
SELECT parent_place_id as place_id, country_code,
null::text as housenumber, postcode,
'place' as class, 'postcode' as type,
null as name, null as address,
null as centroid
INTO place
FROM location_postcode
WHERE place_id = in_place_id;
END IF;
-- POI objects in the placex table
IF place IS NULL THEN
SELECT parent_place_id as place_id, country_code,
coalesce(address->'housenumber',
address->'streetnumber',
address->'conscriptionnumber')::text as housenumber,
postcode,
class, type,
name, address,
centroid
INTO place
FROM placex
WHERE place_id = in_place_id and rank_search > 27;
END IF;
-- If place is still NULL at this point then the object has its own
-- entry in place_address line. However, still check if there is not linked
-- place we should be using instead.
IF place IS NULL THEN
select coalesce(linked_place_id, place_id) as place_id, country_code,
null::text as housenumber, postcode,
class, type,
null as name, address,
null as centroid
INTO place
FROM placex where place_id = in_place_id;
END IF;
--RAISE WARNING '% % % %',searchcountrycode, searchhousenumber, searchpostcode;
-- --- Return the record for the base entry.
current_rank_address := 1000;
FOR location IN
SELECT placex.place_id, osm_type, osm_id, name,
coalesce(extratags->'linked_place', extratags->'place') as place_type,
class, type, admin_level,
CASE WHEN rank_address = 0 THEN 100
WHEN rank_address = 11 THEN 5
ELSE rank_address END as rank_address,
country_code
FROM placex
WHERE place_id = place.place_id
LOOP
--RAISE WARNING '%',location;
-- mix in default names for countries
IF location.rank_address = 4 and place.country_code is not NULL THEN
FOR country IN
SELECT coalesce(name, ''::hstore) as name FROM country_name
WHERE country_code = place.country_code LIMIT 1
LOOP
place.name := country.name || place.name;
END LOOP;
END IF;
IF location.rank_address < 4 THEN
-- no country locations for ranks higher than country
place.country_code := NULL::varchar(2);
ELSEIF place.country_code IS NULL AND location.country_code IS NOT NULL THEN
place.country_code := location.country_code;
END IF;
RETURN NEXT ROW(location.place_id, location.osm_type, location.osm_id,
location.name, location.class, location.type,
location.place_type,
location.admin_level, true,
location.type not in ('postcode', 'postal_code'),
location.rank_address, 0)::addressline;
current_rank_address := location.rank_address;
END LOOP;
-- --- Return records for address parts.
FOR location IN
SELECT placex.place_id, osm_type, osm_id, name, class, type,
coalesce(extratags->'linked_place', extratags->'place') as place_type,
admin_level, fromarea, isaddress,
CASE WHEN rank_address = 11 THEN 5 ELSE rank_address END as rank_address,
distance, country_code, postcode
FROM place_addressline join placex on (address_place_id = placex.place_id)
WHERE place_addressline.place_id IN (place.place_id, in_place_id)
AND linked_place_id is null
AND (placex.country_code IS NULL OR place.country_code IS NULL
OR placex.country_code = place.country_code)
ORDER BY rank_address desc,
(place_addressline.place_id = in_place_id) desc,
(CASE WHEN coalesce((avals(name) && avals(place.address)), False) THEN 2
WHEN isaddress THEN 0
WHEN fromarea
and place.centroid is not null
and ST_Contains(geometry, place.centroid) THEN 1
ELSE -1 END) desc,
fromarea desc, distance asc, rank_search desc
LOOP
-- RAISE WARNING '%',location;
location_isaddress := location.rank_address != current_rank_address;
IF place.country_code IS NULL AND location.country_code IS NOT NULL THEN
place.country_code := location.country_code;
END IF;
IF location.type in ('postcode', 'postal_code')
AND place.postcode is not null
THEN
-- If the place had a postcode assigned, take this one only
-- into consideration when it is an area and the place does not have
-- a postcode itself.
IF location.fromarea AND location_isaddress
AND (place.address is null or not place.address ? 'postcode')
THEN
place.postcode := null; -- remove the less exact postcode
ELSE
location_isaddress := false;
END IF;
END IF;
RETURN NEXT ROW(location.place_id, location.osm_type, location.osm_id,
location.name, location.class, location.type,
location.place_type,
location.admin_level, location.fromarea,
location_isaddress,
location.rank_address,
location.distance)::addressline;
current_rank_address := location.rank_address;
END LOOP;
-- If no country was included yet, add the name information from country_name.
IF current_rank_address > 4 THEN
FOR location IN
SELECT name || coalesce(derived_name, ''::hstore) as name FROM country_name
WHERE country_code = place.country_code LIMIT 1
LOOP
--RAISE WARNING '% % %',current_rank_address,searchcountrycode,countryname;
RETURN NEXT ROW(null, null, null, location.name, 'place', 'country', NULL,
null, true, true, 4, 0)::addressline;
END LOOP;
END IF;
-- Finally add some artificial rows.
IF place.country_code IS NOT NULL THEN
location := ROW(null, null, null, hstore('ref', place.country_code),
'place', 'country_code', null, null, true, false, 4, 0)::addressline;
RETURN NEXT location;
END IF;
IF place.name IS NOT NULL THEN
location := ROW(in_place_id, null, null, place.name, place.class,
place.type, null, null, true, true, 29, 0)::addressline;
RETURN NEXT location;
END IF;
IF place.housenumber IS NOT NULL THEN
location := ROW(null, null, null, hstore('ref', place.housenumber),
'place', 'house_number', null, null, true, true, 28, 0)::addressline;
RETURN NEXT location;
END IF;
IF place.address is not null and place.address ? '_unlisted_place' THEN
RETURN NEXT ROW(null, null, null, hstore('name', place.address->'_unlisted_place'),
'place', 'locality', null, null, true, true, 25, 0)::addressline;
END IF;
IF place.postcode is not null THEN
location := ROW(null, null, null, hstore('ref', place.postcode), 'place',
'postcode', null, null, false, true, 5, 0)::addressline;
RETURN NEXT location;
ELSEIF place.address is not null and place.address ? 'postcode'
and not place.address->'postcode' SIMILAR TO '%(,|;)%' THEN
location := ROW(null, null, null, hstore('ref', place.address->'postcode'), 'place',
'postcode', null, null, false, true, 5, 0)::addressline;
RETURN NEXT location;
END IF;
RETURN;
END;
$$
LANGUAGE plpgsql STABLE;

View File

@@ -2,7 +2,7 @@
--
-- This file is part of Nominatim. (https://nominatim.org)
--
-- Copyright (C) 2022 by the Nominatim developer community.
-- Copyright (C) 2026 by the Nominatim developer community.
-- For a full list of authors see the git log.
-- Functions for interpreting wikipedia/wikidata tags and computing importance.
@@ -65,7 +65,7 @@ BEGIN
RETURN NULL;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
{% else %}
@@ -78,7 +78,7 @@ SELECT convert_from(CAST(E'\\x' || array_to_string(ARRAY(
FROM regexp_matches($1, '%[0-9a-f][0-9a-f]|.', 'gi') AS r(m)
), '') AS bytea), 'UTF8');
$$
LANGUAGE SQL IMMUTABLE STRICT;
LANGUAGE SQL IMMUTABLE STRICT PARALLEL SAFE;
CREATE OR REPLACE FUNCTION catch_decode_url_part(p varchar)
@@ -91,7 +91,7 @@ EXCEPTION
WHEN others THEN return null;
END;
$$
LANGUAGE plpgsql IMMUTABLE STRICT;
LANGUAGE plpgsql IMMUTABLE STRICT PARALLEL SAFE;
CREATE OR REPLACE FUNCTION get_wikipedia_match(extratags HSTORE, country_code varchar(2))
@@ -139,7 +139,7 @@ BEGIN
RETURN NULL;
END;
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
{% endif %}
@@ -166,7 +166,7 @@ BEGIN
END LOOP;
-- Nothing? Then try with the wikidata tag.
IF result.importance is null AND extratags ? 'wikidata' THEN
IF extratags ? 'wikidata' THEN
FOR match IN
{% if 'wikimedia_importance' in db.tables %}
SELECT * FROM wikimedia_importance
@@ -185,23 +185,23 @@ BEGIN
END IF;
-- Still nothing? Fall back to a default.
IF result.importance is null THEN
result.importance := 0.40001 - (rank_search::float / 75);
END IF;
result.importance := 0.40001 - (rank_search::float / 75);
{% if 'secondary_importance' in db.tables %}
FOR match IN
SELECT ST_Value(rast, centroid) as importance
FROM secondary_importance
WHERE ST_Intersects(ST_ConvexHull(rast), centroid) LIMIT 1
FROM secondary_importance
WHERE ST_Intersects(ST_ConvexHull(rast), centroid) LIMIT 1
LOOP
-- Secondary importance as tie breaker with 0.0001 weight.
result.importance := result.importance + match.importance::float / 655350000;
IF match.importance is not NULL THEN
-- Secondary importance as tie breaker with 0.0001 weight.
result.importance := result.importance + match.importance::float / 655350000;
END IF;
END LOOP;
{% endif %}
RETURN result;
END;
$$
LANGUAGE plpgsql;
LANGUAGE plpgsql PARALLEL SAFE;
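For readers less fluent in PL/pgSQL, the fallback formula and the secondary-importance
tie breaker from the hunk above translate roughly to the following Python; the function
name and interface are invented for illustration only.
``` python
from typing import Optional

def fallback_importance(rank_search: int, secondary: Optional[float] = None) -> float:
    # Base value derived from the search rank; lower ranks come out as more important.
    importance = 0.40001 - rank_search / 75
    # The secondary importance raster only acts as a tiny tie breaker
    # (the 16-bit raster value is scaled down by 655350000, roughly a 0.0001 weight).
    if secondary is not None:
        importance += secondary / 655350000
    return importance

print(fallback_importance(30))   # a rank-30 POI without raster coverage: ~0.00001
```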

View File

@@ -34,7 +34,7 @@ BEGIN
RETURN in_address;
END;
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
@@ -70,7 +70,7 @@ BEGIN
RETURN parent_place_id;
END;
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION reinsert_interpolation(way_id BIGINT, addr HSTORE,
@@ -309,7 +309,7 @@ BEGIN
IF NEW.startnumber IS NULL THEN
NEW.startnumber := startnumber;
NEW.endnumber := endnumber;
NEW.linegeo := sectiongeo;
NEW.linegeo := ST_ReducePrecision(sectiongeo, 0.0000001);
NEW.postcode := postcode;
ELSE
INSERT INTO location_property_osmline
@@ -317,7 +317,8 @@ BEGIN
startnumber, endnumber, step,
address, postcode, country_code,
geometry_sector, indexed_status)
VALUES (sectiongeo, NEW.partition, NEW.osm_id, NEW.parent_place_id,
VALUES (ST_ReducePrecision(sectiongeo, 0.0000001),
NEW.partition, NEW.osm_id, NEW.parent_place_id,
startnumber, endnumber, NEW.step,
NEW.address, postcode,
NEW.country_code, NEW.geometry_sector, 0);

View File

@@ -2,7 +2,7 @@
--
-- This file is part of Nominatim. (https://nominatim.org)
--
-- Copyright (C) 2022 by the Nominatim developer community.
-- Copyright (C) 2026 by the Nominatim developer community.
-- For a full list of authors see the git log.
DROP TYPE IF EXISTS nearfeaturecentr CASCADE;
@@ -17,28 +17,6 @@ CREATE TYPE nearfeaturecentr AS (
centroid GEOMETRY
);
-- feature intersects geometry
-- for areas and linestrings they must touch at least along a line
CREATE OR REPLACE FUNCTION is_relevant_geometry(de9im TEXT, geom_type TEXT)
RETURNS BOOLEAN
AS $$
BEGIN
IF substring(de9im from 1 for 2) != 'FF' THEN
RETURN TRUE;
END IF;
IF geom_type = 'ST_Point' THEN
RETURN substring(de9im from 4 for 1) = '0';
END IF;
IF geom_type in ('ST_LineString', 'ST_MultiLineString') THEN
RETURN substring(de9im from 4 for 1) = '1';
END IF;
RETURN substring(de9im from 4 for 1) = '2';
END
$$ LANGUAGE plpgsql IMMUTABLE;
CREATE OR REPLACE function getNearFeatures(in_partition INTEGER, feature GEOMETRY,
feature_centroid GEOMETRY,
maxrank INTEGER)
@@ -59,7 +37,12 @@ BEGIN
isguess, postcode, centroid
FROM location_area_large_{{ partition }}
WHERE geometry && feature
AND is_relevant_geometry(ST_Relate(geometry, feature), ST_GeometryType(feature))
AND CASE WHEN ST_Dimension(feature) = 0
THEN _ST_Covers(geometry, feature)
WHEN ST_Dimension(feature) = 2
THEN ST_Relate(geometry, feature, 'T********')
ELSE ST_NPoints(ST_Intersection(geometry, feature)) > 1
END
AND rank_address < maxrank
-- Postcodes currently still use rank_search to define for which
-- features they are relevant.
@@ -75,7 +58,7 @@ BEGIN
RAISE EXCEPTION 'Unknown partition %', in_partition;
END
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION get_address_place(in_partition SMALLINT, feature GEOMETRY,
@@ -104,7 +87,7 @@ BEGIN
RAISE EXCEPTION 'Unknown partition %', in_partition;
END;
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
create or replace function deleteLocationArea(in_partition INTEGER, in_place_id BIGINT, in_rank_search INTEGER) RETURNS BOOLEAN AS $$
@@ -140,16 +123,20 @@ BEGIN
RETURN TRUE;
END IF;
IF in_rank_search <= 4 and not in_estimate THEN
INSERT INTO location_area_country (place_id, country_code, geometry)
values (in_place_id, in_country_code, in_geometry);
IF in_rank_search <= 4 THEN
IF not in_estimate and in_country_code is not NULL THEN
INSERT INTO location_area_country (place_id, country_code, geometry)
(SELECT in_place_id, in_country_code, geom
FROM split_geometry(in_geometry) as geom);
END IF;
RETURN TRUE;
END IF;
{% for partition in db.partitions %}
IF in_partition = {{ partition }} THEN
INSERT INTO location_area_large_{{ partition }} (partition, place_id, country_code, keywords, rank_search, rank_address, isguess, postcode, centroid, geometry)
values (in_partition, in_place_id, in_country_code, in_keywords, in_rank_search, in_rank_address, in_estimate, postcode, in_centroid, in_geometry);
(SELECT in_partition, in_place_id, in_country_code, in_keywords, in_rank_search, in_rank_address, in_estimate, postcode, in_centroid, geom
FROM split_geometry(in_geometry) as geom);
RETURN TRUE;
END IF;
{% endfor %}
@@ -187,7 +174,7 @@ BEGIN
RAISE EXCEPTION 'Unknown partition %', in_partition;
END
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION getNearestNamedPlacePlaceId(in_partition INTEGER,
point GEOMETRY,
@@ -217,7 +204,7 @@ BEGIN
RAISE EXCEPTION 'Unknown partition %', in_partition;
END
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
create or replace function insertSearchName(
in_partition INTEGER, in_place_id BIGINT, in_name_vector INTEGER[],
@@ -227,7 +214,6 @@ DECLARE
BEGIN
{% for partition in db.partitions %}
IF in_partition = {{ partition }} THEN
DELETE FROM search_name_{{ partition }} values WHERE place_id = in_place_id;
IF in_rank_address > 0 THEN
INSERT INTO search_name_{{ partition }} (place_id, address_rank, name_vector, centroid)
values (in_place_id, in_rank_address, in_name_vector, in_geometry);
@@ -266,7 +252,6 @@ BEGIN
{% for partition in db.partitions %}
IF in_partition = {{ partition }} THEN
DELETE FROM location_road_{{ partition }} where place_id = in_place_id;
INSERT INTO location_road_{{ partition }} (partition, place_id, country_code, geometry)
values (in_partition, in_place_id, in_country_code, in_geometry);
RETURN TRUE;
@@ -325,7 +310,7 @@ BEGIN
RAISE EXCEPTION 'Unknown partition %', in_partition;
END
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION getNearestParallelRoadFeature(in_partition INTEGER,
line GEOMETRY)
@@ -369,4 +354,4 @@ BEGIN
RAISE EXCEPTION 'Unknown partition %', in_partition;
END
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;

View File

@@ -2,7 +2,7 @@
--
-- This file is part of Nominatim. (https://nominatim.org)
--
-- Copyright (C) 2022 by the Nominatim developer community.
-- Copyright (C) 2026 by the Nominatim developer community.
-- For a full list of authors see the git log.
CREATE OR REPLACE FUNCTION place_insert()
@@ -66,7 +66,8 @@ BEGIN
-- They get their parent from the interpolation.
UPDATE placex p SET indexed_status = 2
FROM planet_osm_ways w
WHERE w.id = NEW.osm_id and p.osm_type = 'N' and p.osm_id = any(w.nodes);
WHERE w.id = NEW.osm_id and p.osm_type = 'N' and p.osm_id = any(w.nodes)
and indexed_status = 0;
-- If there is already an entry in place, just update that, if necessary.
IF existing.osm_type is not null THEN
@@ -89,35 +90,6 @@ BEGIN
RETURN NEW;
END IF;
-- ---- Postcode points.
IF NEW.class = 'place' AND NEW.type = 'postcode' THEN
-- Pure postcodes are never queried from placex so we don't add them.
-- location_postcodes is filled from the place table directly.
-- Remove any old placex entry.
DELETE FROM placex WHERE osm_type = NEW.osm_type and osm_id = NEW.osm_id;
IF existing.osm_type IS NOT NULL THEN
IF coalesce(existing.address, ''::hstore) != coalesce(NEW.address, ''::hstore)
OR existing.geometry::text != NEW.geometry::text
THEN
UPDATE place
SET name = NEW.name,
address = NEW.address,
extratags = NEW.extratags,
admin_level = NEW.admin_level,
geometry = NEW.geometry
WHERE osm_type = NEW.osm_type and osm_id = NEW.osm_id
and class = NEW.class and type = NEW.type;
END IF;
RETURN NULL;
END IF;
RETURN NEW;
END IF;
-- ---- All other place types.
-- When an area is changed from large to small: log and discard change
@@ -269,17 +241,6 @@ BEGIN
WHERE osm_type = NEW.osm_type and osm_id = NEW.osm_id
and class = NEW.class and type = NEW.type;
-- Postcode areas are only kept, when there is an actual postcode assigned.
IF NEW.class = 'boundary' AND NEW.type = 'postal_code' THEN
IF NEW.address is NULL OR NOT NEW.address ? 'postcode' THEN
-- postcode was deleted, no longer retain in placex
DELETE FROM placex where place_id = existingplacex.place_id;
RETURN NULL;
END IF;
NEW.name := hstore('ref', NEW.address->'postcode');
END IF;
-- Boundaries must be areas.
IF NEW.class in ('boundary')
AND ST_GeometryType(NEW.geometry) not in ('ST_Polygon','ST_MultiPolygon')
@@ -338,6 +299,11 @@ BEGIN
END IF;
END IF;
-- When an existing way is updated, recalculate entrances
IF existingplacex.osm_type = 'W' and (existingplacex.rank_search > 27 or existingplacex.class IN ('landuse', 'leisure')) THEN
PERFORM place_update_entrances(existingplacex.place_id, existingplacex.osm_id);
END IF;
-- Abort the insertion (we modified the existing place instead)
RETURN NULL;
END;

View File

@@ -2,7 +2,7 @@
--
-- This file is part of Nominatim. (https://nominatim.org)
--
-- Copyright (C) 2024 by the Nominatim developer community.
-- Copyright (C) 2026 by the Nominatim developer community.
-- For a full list of authors see the git log.
-- Trigger functions for the placex table.
@@ -109,7 +109,7 @@ BEGIN
RETURN result;
END;
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION find_associated_street(poi_osm_type CHAR(1),
@@ -200,7 +200,7 @@ BEGIN
RETURN result;
END;
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
-- Find the parent road of a POI.
@@ -286,7 +286,7 @@ BEGIN
RETURN parent_place_id;
END;
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
-- Try to find a linked place for the given object.
CREATE OR REPLACE FUNCTION find_linked_place(bnd placex)
@@ -304,7 +304,6 @@ DECLARE
BEGIN
IF bnd.rank_search >= 26 or bnd.rank_address = 0
or ST_GeometryType(bnd.geometry) NOT IN ('ST_Polygon','ST_MultiPolygon')
or bnd.type IN ('postcode', 'postal_code')
THEN
RETURN NULL;
END IF;
@@ -341,26 +340,6 @@ BEGIN
END IF;
END IF;
-- If extratags has a place tag, look for linked nodes by their place type.
-- Area and node still have to have the same name.
IF bnd.extratags ? 'place' and bnd.extratags->'place' != 'postcode'
and bnd_name is not null
THEN
FOR linked_placex IN
SELECT * FROM placex
WHERE (position(lower(name->'name') in bnd_name) > 0
OR position(bnd_name in lower(name->'name')) > 0)
AND placex.class = 'place' AND placex.type = bnd.extratags->'place'
AND placex.osm_type = 'N'
AND (placex.linked_place_id is null or placex.linked_place_id = bnd.place_id)
AND placex.rank_search < 26 -- needed to select the right index
AND ST_Covers(bnd.geometry, placex.geometry)
LOOP
{% if debug %}RAISE WARNING 'Found type-matching place node %', linked_placex.osm_id;{% endif %}
RETURN linked_placex;
END LOOP;
END IF;
IF bnd.extratags ? 'wikidata' THEN
FOR linked_placex IN
SELECT * FROM placex
@@ -377,6 +356,25 @@ BEGIN
END LOOP;
END IF;
-- If extratags has a place tag, look for linked nodes by their place type.
-- Area and node still have to have the same name.
IF bnd.extratags ? 'place' and bnd_name is not null
THEN
FOR linked_placex IN
SELECT * FROM placex
WHERE (position(lower(name->'name') in bnd_name) > 0
OR position(bnd_name in lower(name->'name')) > 0)
AND placex.class = 'place' AND placex.type = bnd.extratags->'place'
AND placex.osm_type = 'N'
AND (placex.linked_place_id is null or placex.linked_place_id = bnd.place_id)
AND placex.rank_search < 26 -- needed to select the right index
AND ST_Covers(bnd.geometry, placex.geometry)
LOOP
{% if debug %}RAISE WARNING 'Found type-matching place node %', linked_placex.osm_id;{% endif %}
RETURN linked_placex;
END LOOP;
END IF;
-- Name searches can be done for ways as well as relations
IF bnd_name is not null THEN
{% if debug %}RAISE WARNING 'Looking for nodes with matching names';{% endif %}
@@ -393,7 +391,6 @@ BEGIN
AND placex.class = 'place'
AND (placex.linked_place_id is null or placex.linked_place_id = bnd.place_id)
AND placex.rank_search < 26 -- needed to select the right index
AND placex.type != 'postcode'
AND ST_Covers(bnd.geometry, placex.geometry)
LOOP
{% if debug %}RAISE WARNING 'Found matching place node %', linked_placex.osm_id;{% endif %}
@@ -404,7 +401,7 @@ BEGIN
RETURN NULL;
END;
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION create_poi_search_terms(obj_place_id BIGINT,
@@ -468,7 +465,7 @@ BEGIN
END IF;
END LOOP;
name_vector := token_get_name_search_tokens(token_info);
name_vector := COALESCE(token_get_name_search_tokens(token_info), '{}'::INTEGER[]);
-- Check if the parent covers all address terms.
-- If not, create a search name entry with the house number as the name.
@@ -530,6 +527,7 @@ CREATE OR REPLACE FUNCTION insert_addresslines(obj_place_id BIGINT,
AS $$
DECLARE
address_havelevel BOOLEAN[];
place_min_distance FLOAT[];
location_isaddress BOOLEAN;
current_boundary GEOMETRY := NULL;
@@ -545,6 +543,7 @@ BEGIN
nameaddress_vector := '{}'::int[];
address_havelevel := array_fill(false, ARRAY[maxrank]);
place_min_distance := array_fill(1.0, ARRAY[maxrank]);
FOR location IN
SELECT apl.*, key
@@ -575,6 +574,10 @@ BEGIN
END IF;
END IF;
IF location.isguess and location.distance < place_min_distance[location.rank_address] THEN
place_min_distance[location.rank_address] := location.distance;
END IF;
INSERT INTO place_addressline (place_id, address_place_id, fromarea,
isaddress, distance, cached_rank_address)
VALUES (obj_place_id, location.place_id, not location.isguess,
@@ -602,6 +605,16 @@ BEGIN
-- If this is the first item in the rank, then assume it is the address.
location_isaddress := not address_havelevel[location.rank_address];
-- Ignore guessed places when they are too far away compared to similar closer ones.
IF location.isguess THEN
CONTINUE WHEN not location_isaddress
AND location.distance > 2 * place_min_distance[location.rank_address];
IF location.distance < place_min_distance[location.rank_address] THEN
place_min_distance[location.rank_address] := location.distance;
END IF;
END IF;
-- Further sanity checks to ensure that the address forms a sane hierarchy.
IF location_isaddress THEN
IF location.isguess and current_node_area is not NULL THEN
@@ -638,8 +651,10 @@ BEGIN
-- Add it to the list of search terms
{% if not db.reverse_only %}
nameaddress_vector := array_merge(nameaddress_vector,
location.keywords::integer[]);
IF location.rank_address != 11 AND location.rank_address != 5 THEN
nameaddress_vector := array_merge(nameaddress_vector,
location.keywords::integer[]);
END IF;
{% endif %}
INSERT INTO place_addressline (place_id, address_place_id, fromarea,
@@ -679,17 +694,7 @@ BEGIN
ELSE
is_area := ST_GeometryType(NEW.geometry) IN ('ST_Polygon','ST_MultiPolygon');
IF NEW.class in ('place','boundary')
AND NEW.type in ('postcode','postal_code')
THEN
IF NEW.address IS NULL OR NOT NEW.address ? 'postcode' THEN
-- most likely just a part of a multipolygon postcode boundary, throw it away
RETURN NULL;
END IF;
NEW.name := hstore('ref', NEW.address->'postcode');
ELSEIF NEW.class = 'highway' AND is_area AND NEW.name is null
IF NEW.class = 'highway' AND is_area AND NEW.name is null
AND NEW.extratags ? 'area' AND NEW.extratags->'area' = 'yes'
THEN
RETURN NULL;
@@ -730,7 +735,7 @@ BEGIN
IF NEW.rank_address between 2 and 27 THEN
IF (ST_GeometryType(NEW.geometry) in ('ST_Polygon','ST_MultiPolygon') AND ST_IsValid(NEW.geometry)) THEN
-- Performance: We just can't handle re-indexing for country level changes
IF (NEW.rank_address < 26 and st_area(NEW.geometry) < 1)
IF (NEW.rank_address < 26 and st_area(NEW.geometry) <= 2)
OR (NEW.rank_address >= 26 and st_area(NEW.geometry) < 0.01)
THEN
-- mark items within the geometry for re-indexing
@@ -778,7 +783,7 @@ BEGIN
SELECT count(*)>0 FROM pg_tables WHERE tablename = classtable and schemaname = current_schema() INTO result;
IF result THEN
EXECUTE 'INSERT INTO ' || classtable::regclass || ' (place_id, centroid) VALUES ($1,$2)'
USING NEW.place_id, ST_Centroid(NEW.geometry);
USING NEW.place_id, NEW.centroid;
END IF;
{% endif %} -- not disable_diff_updates
@@ -835,13 +840,15 @@ BEGIN
NEW.indexed_date = now();
{% if 'search_name' in db.tables %}
DELETE from search_name WHERE place_id = NEW.place_id;
{% endif %}
result := deleteSearchName(NEW.partition, NEW.place_id);
DELETE FROM place_addressline WHERE place_id = NEW.place_id;
result := deleteRoad(NEW.partition, NEW.place_id);
result := deleteLocationArea(NEW.partition, NEW.place_id, NEW.rank_search);
IF OLD.indexed_status > 1 THEN
{% if 'search_name' in db.tables %}
DELETE from search_name WHERE place_id = NEW.place_id;
{% endif %}
result := deleteSearchName(NEW.partition, NEW.place_id);
DELETE FROM place_addressline WHERE place_id = NEW.place_id;
result := deleteRoad(NEW.partition, NEW.place_id);
result := deleteLocationArea(NEW.partition, NEW.place_id, NEW.rank_search);
END IF;
NEW.extratags := NEW.extratags - 'linked_place'::TEXT;
IF NEW.extratags = ''::hstore THEN
@@ -854,24 +861,22 @@ BEGIN
NEW.linked_place_id := OLD.linked_place_id;
-- Remove linkage, if we have computed a different new linkee.
UPDATE placex SET linked_place_id = null, indexed_status = 2
WHERE linked_place_id = NEW.place_id
and (linked_place is null or linked_place_id != linked_place);
-- update not necessary for osmline, cause linked_place_id does not exist
-- Postcodes are just here to compute the centroids. They are not searchable
-- unless they are a boundary=postal_code.
-- There was an error in the style so that boundary=postal_code used to be
-- imported as place=postcode. That's why relations are allowed to pass here.
-- This can go away in a couple of versions.
IF NEW.class = 'place' and NEW.type = 'postcode' and NEW.osm_type != 'R' THEN
NEW.token_info := null;
RETURN NEW;
IF OLD.indexed_status > 1 THEN
UPDATE placex
SET linked_place_id = null,
indexed_status = CASE WHEN indexed_status = 0 THEN 2 ELSE indexed_status END
WHERE linked_place_id = NEW.place_id
and (linked_place is null or place_id != linked_place);
END IF;
-- Compute a preliminary centroid.
NEW.centroid := get_center_point(NEW.geometry);
-- Record the entrance node locations
IF NEW.osm_type = 'W' and (NEW.rank_search > 27 or NEW.class IN ('landuse', 'leisure')) THEN
PERFORM place_update_entrances(NEW.place_id, NEW.osm_id);
END IF;
-- recalculate country and partition
IF NEW.rank_search = 4 AND NEW.address is not NULL AND NEW.address ? 'country' THEN
-- for countries, believe the mapped country code,
@@ -960,9 +965,8 @@ BEGIN
WHERE class = 'place' and rank_address between 1 and 23
and prank.address_rank >= NEW.rank_address
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon') -- select right index
and geometry && NEW.geometry
and geometry ~ NEW.geometry -- needed because ST_Relate does not do bbox cover test
and ST_Relate(geometry, NEW.geometry, 'T*T***FF*') -- contains but not equal
and ST_Contains(geometry, NEW.geometry)
and not ST_Equals(geometry, NEW.geometry)
ORDER BY prank.address_rank desc LIMIT 1
LOOP
NEW.rank_address := location.rank_address + 2;
@@ -983,9 +987,8 @@ BEGIN
and rank_address between 1 and 25 -- select right index
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon') -- select right index
and prank.address_rank >= NEW.rank_address
and geometry && NEW.geometry
and geometry ~ NEW.geometry -- needed because ST_Relate does not do bbox cover test
and ST_Relate(geometry, NEW.geometry, 'T*T***FF*') -- contains but not equal
and ST_Contains(geometry, NEW.geometry)
and not ST_Equals(geometry, NEW.geometry)
ORDER BY prank.address_rank desc LIMIT 1
LOOP
NEW.rank_address := location.rank_address + 2;
@@ -1034,7 +1037,9 @@ BEGIN
LOOP
UPDATE placex SET linked_place_id = NEW.place_id WHERE place_id = linked_node_id;
{% if 'search_name' in db.tables %}
DELETE FROM search_name WHERE place_id = linked_node_id;
IF OLD.indexed_status > 1 THEN
DELETE FROM search_name WHERE place_id = linked_node_id;
END IF;
{% endif %}
END LOOP;
END IF;
@@ -1183,11 +1188,6 @@ BEGIN
-- reset the address rank if necessary.
UPDATE placex set linked_place_id = NEW.place_id, indexed_status = 2
WHERE place_id = location.place_id;
-- ensure that those places are not found anymore
{% if 'search_name' in db.tables %}
DELETE FROM search_name WHERE place_id = location.place_id;
{% endif %}
PERFORM deleteLocationArea(NEW.partition, location.place_id, NEW.rank_search);
SELECT wikipedia, importance
FROM compute_importance(location.extratags, NEW.country_code,
@@ -1198,7 +1198,7 @@ BEGIN
IF linked_importance is not null AND
(NEW.importance is null or NEW.importance < linked_importance)
THEN
NEW.importance = linked_importance;
NEW.importance := linked_importance;
END IF;
ELSE
-- No linked place? As a last resort check if the boundary is tagged with
@@ -1240,7 +1240,7 @@ BEGIN
LIMIT 1
LOOP
IF location.osm_id = NEW.osm_id THEN
{% if debug %}RAISE WARNING 'Updating names for country '%' with: %', NEW.country_code, NEW.name;{% endif %}
{% if debug %}RAISE WARNING 'Updating names for country ''%'' with: %', NEW.country_code, NEW.name;{% endif %}
UPDATE country_name SET derived_name = NEW.name WHERE country_code = NEW.country_code;
END IF;
END LOOP;
@@ -1265,8 +1265,6 @@ BEGIN
END IF;
ELSEIF NEW.rank_address > 25 THEN
max_rank := 25;
ELSEIF NEW.class in ('place','boundary') and NEW.type in ('postcode','postal_code') THEN
max_rank := NEW.rank_search;
ELSE
max_rank := NEW.rank_address;
END IF;
@@ -1281,10 +1279,10 @@ BEGIN
NEW.postcode := coalesce(token_get_postcode(NEW.token_info), NEW.postcode);
-- if we have a name add this to the name search table
IF NEW.name IS NOT NULL THEN
name_vector := token_get_name_search_tokens(NEW.token_info);
IF array_length(name_vector, 1) is not NULL THEN
-- Initialise the name vector using our name
NEW.name := add_default_place_name(NEW.country_code, NEW.name);
name_vector := token_get_name_search_tokens(NEW.token_info);
IF NEW.rank_search <= 25 and NEW.rank_address > 0 THEN
result := add_location(NEW.place_id, NEW.country_code, NEW.partition,
@@ -1344,10 +1342,10 @@ BEGIN
-- RAISE WARNING 'placex_delete % %',OLD.osm_type,OLD.osm_id;
IF OLD.linked_place_id is null THEN
update placex set linked_place_id = null, indexed_status = 2 where linked_place_id = OLD.place_id and indexed_status = 0;
{% if debug %}RAISE WARNING 'placex_delete:01 % %',OLD.osm_type,OLD.osm_id;{% endif %}
update placex set linked_place_id = null where linked_place_id = OLD.place_id;
{% if debug %}RAISE WARNING 'placex_delete:02 % %',OLD.osm_type,OLD.osm_id;{% endif %}
UPDATE placex
SET linked_place_id = NULL,
indexed_status = CASE WHEN indexed_status = 0 THEN 2 ELSE indexed_status END
WHERE linked_place_id = OLD.place_id;
ELSE
update placex set indexed_status = 2 where place_id = OLD.linked_place_id and indexed_status = 0;
END IF;
@@ -1371,6 +1369,7 @@ BEGIN
-- reparenting also for OSM Interpolation Lines (and for Tiger?)
update location_property_osmline set indexed_status = 2 where indexed_status = 0 and parent_place_id = OLD.place_id;
UPDATE location_postcodes SET indexed_status = 2 WHERE parent_place_id = OLD.place_id;
END IF;
{% if debug %}RAISE WARNING 'placex_delete:08 % %',OLD.osm_type,OLD.osm_id;{% endif %}
@@ -1402,9 +1401,6 @@ BEGIN
END IF;
{% if debug %}RAISE WARNING 'placex_delete:12 % %',OLD.osm_type,OLD.osm_id;{% endif %}
UPDATE location_postcode SET indexed_status = 2 WHERE parent_place_id = OLD.place_id;
RETURN OLD;
END;

View File

@@ -2,10 +2,10 @@
--
-- This file is part of Nominatim. (https://nominatim.org)
--
-- Copyright (C) 2022 by the Nominatim developer community.
-- Copyright (C) 2025 by the Nominatim developer community.
-- For a full list of authors see the git log.
-- Trigger functions for location_postcode table.
-- Trigger functions for location_postcodes table.
-- Trigger for updates of location_postcode
@@ -13,7 +13,7 @@
-- Computes the parent object the postcode most likely refers to.
-- This will be the place that determines the address displayed when
-- searching for this postcode.
CREATE OR REPLACE FUNCTION postcode_update()
CREATE OR REPLACE FUNCTION postcodes_update()
RETURNS TRIGGER
AS $$
DECLARE
@@ -28,13 +28,10 @@ BEGIN
partition := get_partition(NEW.country_code);
SELECT * FROM get_postcode_rank(NEW.country_code, NEW.postcode)
INTO NEW.rank_search, NEW.rank_address;
NEW.parent_place_id = 0;
FOR location IN
SELECT place_id
FROM getNearFeatures(partition, NEW.geometry, NEW.geometry, NEW.rank_search)
FROM getNearFeatures(partition, NEW.centroid, NEW.centroid, NEW.rank_search)
WHERE NOT isguess ORDER BY rank_address DESC, distance asc LIMIT 1
LOOP
NEW.parent_place_id = location.place_id;
@@ -45,3 +42,89 @@ END;
$$
LANGUAGE plpgsql;
CREATE OR REPLACE FUNCTION postcodes_delete()
RETURNS TRIGGER
AS $$
BEGIN
{% if not disable_diff_updates %}
UPDATE placex p SET indexed_status = 2
WHERE p.postcode = OLD.postcode AND ST_Intersects(OLD.geometry, p.geometry)
AND indexed_status = 0;
{% endif %}
RETURN OLD;
END;
$$
LANGUAGE plpgsql;
CREATE OR REPLACE FUNCTION postcodes_insert()
RETURNS TRIGGER
AS $$
DECLARE
existing RECORD;
BEGIN
IF NEW.osm_id is not NULL THEN
-- postcode area, remove existing from same OSM object
SELECT * INTO existing FROM location_postcodes p
WHERE p.osm_id = NEW.osm_id;
IF existing.place_id is not NULL THEN
IF existing.postcode != NEW.postcode or existing.country_code != NEW.country_code THEN
DELETE FROM location_postcodes p WHERE p.osm_id = NEW.osm_id;
existing := NULL;
END IF;
END IF;
END IF;
IF existing is NULL THEN
SELECT * INTO existing FROM location_postcodes p
WHERE p.country_code = NEW.country_code AND p.postcode = NEW.postcode;
IF existing.postcode is NULL THEN
{% if not disable_diff_updates %}
UPDATE placex p SET indexed_status = 2
WHERE ST_Intersects(NEW.geometry, p.geometry)
AND indexed_status = 0
AND p.rank_address >= 22 AND not p.address ? 'postcode';
{% endif %}
-- new entry, just insert
NEW.indexed_status := 1;
NEW.place_id := nextval('seq_place');
RETURN NEW;
END IF;
END IF;
-- update: only when there are changes
IF coalesce(NEW.osm_id, -1) != coalesce(existing.osm_id, -1)
OR (NEW.osm_id is not null AND NEW.geometry::text != existing.geometry::text)
OR (NEW.osm_id is null
AND (abs(ST_X(existing.centroid) - ST_X(NEW.centroid)) > 0.0000001
OR abs(ST_Y(existing.centroid) - ST_Y(NEW.centroid)) > 0.0000001))
THEN
{% if not disable_diff_updates %}
UPDATE placex p SET indexed_status = 2
WHERE ST_Intersects(ST_Difference(NEW.geometry, existing.geometry), p.geometry)
AND indexed_status = 0
AND p.rank_address >= 22 AND not p.address ? 'postcode';
UPDATE placex p SET indexed_status = 2
WHERE ST_Intersects(ST_Difference(existing.geometry, NEW.geometry), p.geometry)
AND indexed_status = 0
AND p.postcode = OLD.postcode;
{% endif %}
UPDATE location_postcodes p
SET osm_id = NEW.osm_id,
indexed_status = 2,
centroid = NEW.centroid,
geometry = NEW.geometry
WHERE p.country_code = NEW.country_code AND p.postcode = NEW.postcode;
END IF;
RETURN NULL;
END;
$$
LANGUAGE plpgsql;

View File

@@ -2,7 +2,7 @@
--
-- This file is part of Nominatim. (https://nominatim.org)
--
-- Copyright (C) 2022 by the Nominatim developer community.
-- Copyright (C) 2025 by the Nominatim developer community.
-- For a full list of authors see the git log.
-- Functions related to search and address ranks
@@ -29,7 +29,7 @@ BEGIN
RETURN 0.02;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
-- Return an approximate update radius according to the search rank.
@@ -60,7 +60,7 @@ BEGIN
RETURN 0;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
-- Compute a base address rank from the extent of the given geometry.
--
@@ -88,6 +88,10 @@ BEGIN
area := area / 3;
ELSIF country_code IN ('bo', 'ar', 'sd', 'mn', 'in', 'et', 'cd', 'mz', 'ly', 'cl', 'zm') THEN
area := area / 2;
ELSIF country_code IN ('sg', 'ws', 'st', 'kn') THEN
area := area * 5;
ELSIF country_code IN ('dm', 'mt', 'lc', 'gg', 'sc', 'nr') THEN
area := area * 20;
END IF;
IF area > 1 THEN
@@ -107,67 +111,7 @@ BEGIN
RETURN 23;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
-- Guess a ranking for postcodes from country and postcode format.
CREATE OR REPLACE FUNCTION get_postcode_rank(country_code VARCHAR(2), postcode TEXT,
OUT rank_search SMALLINT,
OUT rank_address SMALLINT)
AS $$
DECLARE
part TEXT;
BEGIN
rank_search := 30;
rank_address := 30;
postcode := upper(postcode);
IF country_code = 'gb' THEN
IF postcode ~ '^([A-Z][A-Z]?[0-9][0-9A-Z]? [0-9][A-Z][A-Z])$' THEN
rank_search := 25;
rank_address := 5;
ELSEIF postcode ~ '^([A-Z][A-Z]?[0-9][0-9A-Z]? [0-9])$' THEN
rank_search := 23;
rank_address := 5;
ELSEIF postcode ~ '^([A-Z][A-Z]?[0-9][0-9A-Z])$' THEN
rank_search := 21;
rank_address := 5;
END IF;
ELSEIF country_code = 'sg' THEN
IF postcode ~ '^([0-9]{6})$' THEN
rank_search := 25;
rank_address := 11;
END IF;
ELSEIF country_code = 'de' THEN
IF postcode ~ '^([0-9]{5})$' THEN
rank_search := 21;
rank_address := 11;
END IF;
ELSE
-- Guess at the postcode format and coverage (!)
IF postcode ~ '^[A-Z0-9]{1,5}$' THEN -- Probably too short to be very local
rank_search := 21;
rank_address := 11;
ELSE
-- Does it look splitable into and area and local code?
part := substring(postcode from '^([- :A-Z0-9]+)([- :][A-Z0-9]+)$');
IF part IS NOT NULL THEN
rank_search := 25;
rank_address := 11;
ELSEIF postcode ~ '^[- :A-Z0-9]{6,}$' THEN
rank_search := 21;
rank_address := 11;
END IF;
END IF;
END IF;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
-- Get standard search and address rank for an object.
@@ -194,12 +138,7 @@ AS $$
DECLARE
classtype TEXT;
BEGIN
IF place_class in ('place','boundary')
and place_type in ('postcode','postal_code')
THEN
SELECT * INTO search_rank, address_rank
FROM get_postcode_rank(country, postcode);
ELSEIF extended_type = 'N' AND place_class = 'highway' THEN
IF extended_type = 'N' AND place_class = 'highway' THEN
search_rank = 30;
address_rank = 30;
ELSEIF place_class = 'landuse' AND extended_type != 'A' THEN
@@ -236,7 +175,7 @@ BEGIN
END IF;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION get_addr_tag_rank(key TEXT, country TEXT,
OUT from_rank SMALLINT,
@@ -283,7 +222,7 @@ BEGIN
END LOOP;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION weigh_search(search_vector INT[],
@@ -304,4 +243,4 @@ BEGIN
RETURN def_weight;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;

View File

@@ -2,7 +2,7 @@
--
-- This file is part of Nominatim. (https://nominatim.org)
--
-- Copyright (C) 2022 by the Nominatim developer community.
-- Copyright (C) 2026 by the Nominatim developer community.
-- For a full list of authors see the git log.
-- Assorted helper functions for the triggers.
@@ -14,17 +14,17 @@ DECLARE
geom_type TEXT;
BEGIN
geom_type := ST_GeometryType(place);
IF geom_type = ' ST_Point' THEN
IF geom_type = 'ST_Point' THEN
RETURN place;
END IF;
IF geom_type = 'ST_LineString' THEN
RETURN ST_LineInterpolatePoint(place, 0.5);
RETURN ST_ReducePrecision(ST_LineInterpolatePoint(place, 0.5), 0.0000001);
END IF;
RETURN ST_PointOnSurface(place);
RETURN ST_ReducePrecision(ST_PointOnSurface(place), 0.0000001);
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION geometry_sector(partition INTEGER, place GEOMETRY)
@@ -34,7 +34,7 @@ BEGIN
RETURN (partition*1000000) + (500-ST_X(place)::INTEGER)*1000 + (500-ST_Y(place)::INTEGER);
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
@@ -46,13 +46,13 @@ DECLARE
r INTEGER[];
BEGIN
IF array_upper(a, 1) IS NULL THEN
RETURN b;
RETURN COALESCE(b, '{}'::INTEGER[]);
END IF;
IF array_upper(b, 1) IS NULL THEN
RETURN a;
RETURN COALESCE(a, '{}'::INTEGER[]);
END IF;
r := a;
FOR i IN 1..array_upper(b, 1) LOOP
FOR i IN 1..array_upper(b, 1) LOOP
IF NOT (ARRAY[b[i]] <@ r) THEN
r := r || b[i];
END IF;
@@ -60,7 +60,7 @@ BEGIN
RETURN r;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
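-- A minimal sketch, assuming the helper above is the array_merge() function
-- referenced by the tokenizer functions further down; the input arrays are
-- arbitrary example values.
SELECT array_merge(ARRAY[1, 2, 3], ARRAY[3, 4]);  -- expected result: {1,2,3,4}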
-- Return the node members with a given label from a relation member list
-- as a set.
@@ -88,7 +88,7 @@ BEGIN
RETURN;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION get_rel_node_members(members JSONB, memberLabels TEXT[])
@@ -107,7 +107,7 @@ BEGIN
RETURN;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
-- Copy 'name' to or from the default language.
@@ -136,43 +136,52 @@ BEGIN
END IF;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
-- Find the nearest artificial postcode for the given geometry.
-- TODO For areas there should not be more than two inside the geometry.
-- Find the best-matching postcode for the given geometry
CREATE OR REPLACE FUNCTION get_nearest_postcode(country VARCHAR(2), geom GEOMETRY)
RETURNS TEXT
AS $$
DECLARE
outcode TEXT;
cnt INTEGER;
location RECORD;
BEGIN
-- If the geometry is an area then only one postcode must be within
-- that area, otherwise consider the area as not having a postcode.
IF ST_GeometryType(geom) in ('ST_Polygon','ST_MultiPolygon') THEN
SELECT min(postcode), count(*) FROM
(SELECT postcode FROM location_postcode
WHERE ST_Contains(geom, location_postcode.geometry) LIMIT 2) sub
INTO outcode, cnt;
SELECT min(postcode), count(*) FROM
(SELECT postcode FROM location_postcodes
WHERE geom && location_postcodes.geometry -- want to use the index
AND ST_Contains(geom, location_postcodes.centroid)
AND country_code = country
LIMIT 2) sub
INTO outcode, cnt;
IF cnt = 1 THEN
RETURN outcode;
ELSE
RETURN null;
END IF;
RETURN null;
END IF;
SELECT postcode FROM location_postcode
WHERE ST_DWithin(geom, location_postcode.geometry, 0.05)
AND location_postcode.country_code = country
ORDER BY ST_Distance(geom, location_postcode.geometry) LIMIT 1
INTO outcode;
-- Otherwise the geometry must lie fully within the coverage area of a postcode
FOR location IN
SELECT postcode
FROM location_postcodes p
WHERE ST_Covers(p.geometry, geom)
AND p.country_code = country
ORDER BY osm_id is null, ST_Distance(p.centroid, geom)
LIMIT 1
LOOP
RETURN location.postcode;
END LOOP;
RETURN outcode;
RETURN NULL;
END;
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
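-- A minimal usage sketch for the reworked postcode lookup above; the country
-- code and coordinates are hypothetical example values.
SELECT get_nearest_postcode('de', ST_SetSRID(ST_MakePoint(13.4050, 52.5200), 4326));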
CREATE OR REPLACE FUNCTION get_country_code(place geometry)
@@ -233,7 +242,7 @@ BEGIN
RETURN NULL;
END;
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION get_country_language_code(search_country_code VARCHAR(2))
@@ -251,7 +260,7 @@ BEGIN
RETURN NULL;
END;
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION get_partition(in_country_code VARCHAR(10))
@@ -268,7 +277,7 @@ BEGIN
RETURN 0;
END;
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
-- Find the parent of an address with addr:street/addr:place tag.
@@ -299,7 +308,7 @@ BEGIN
RETURN parent_place_id;
END;
$$
LANGUAGE plpgsql STABLE;
LANGUAGE plpgsql STABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION delete_location(OLD_place_id BIGINT)
@@ -314,6 +323,17 @@ END;
$$
LANGUAGE plpgsql;
-- Return the bounding box of the geometry buffered by the given number
-- of meters.
CREATE OR REPLACE FUNCTION expand_by_meters(geom GEOMETRY, meters FLOAT)
RETURNS GEOMETRY
AS $$
SELECT ST_Envelope(ST_Buffer(geom::geography, meters, 1)::geometry)
$$
LANGUAGE sql IMMUTABLE PARALLEL SAFE;
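-- A minimal sketch of calling the new buffering helper; the point and the
-- 500 m radius are arbitrary example values.
SELECT ST_AsText(expand_by_meters(ST_SetSRID(ST_MakePoint(0, 0), 4326), 500));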
-- Create a bounding box with an extent computed from the radius (in meters)
-- which in turn is derived from the given search rank.
CREATE OR REPLACE FUNCTION place_node_fuzzy_area(geom GEOMETRY, rank_search INTEGER)
@@ -332,12 +352,10 @@ BEGIN
radius := 1000;
END IF;
RETURN ST_Envelope(ST_Collect(
ST_Project(geom::geography, radius, 0.785398)::geometry,
ST_Project(geom::geography, radius, 3.9269908)::geometry));
RETURN expand_by_meters(geom, radius);
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION add_location(place_id BIGINT, country_code varchar(2),
@@ -348,30 +366,27 @@ CREATE OR REPLACE FUNCTION add_location(place_id BIGINT, country_code varchar(2)
RETURNS BOOLEAN
AS $$
DECLARE
locationid INTEGER;
secgeo GEOMETRY;
postcode TEXT;
BEGIN
PERFORM deleteLocationArea(partition, place_id, rank_search);
-- add postcode only if it contains a single entry, i.e. ignore postcode lists
postcode := NULL;
IF in_postcode is not null AND in_postcode not similar to '%(,|;)%' THEN
postcode := upper(trim (in_postcode));
END IF;
IF ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon') THEN
FOR secgeo IN select split_geometry(geometry) AS geom LOOP
PERFORM insertLocationAreaLarge(partition, place_id, country_code, keywords, rank_search, rank_address, false, postcode, centroid, secgeo);
END LOOP;
ELSEIF ST_GeometryType(geometry) = 'ST_Point' THEN
secgeo := place_node_fuzzy_area(geometry, rank_search);
PERFORM insertLocationAreaLarge(partition, place_id, country_code, keywords, rank_search, rank_address, true, postcode, centroid, secgeo);
IF ST_Dimension(geometry) = 2 THEN
RETURN insertLocationAreaLarge(partition, place_id, country_code, keywords,
rank_search, rank_address, false, postcode,
centroid, geometry);
END IF;
RETURN true;
IF ST_Dimension(geometry) = 0 THEN
RETURN insertLocationAreaLarge(partition, place_id, country_code, keywords,
rank_search, rank_address, true, postcode,
centroid, place_node_fuzzy_area(geometry, rank_search));
END IF;
RETURN false;
END;
$$
LANGUAGE plpgsql;
@@ -394,19 +409,21 @@ DECLARE
geo RECORD;
area FLOAT;
remainingdepth INTEGER;
added INTEGER;
BEGIN
-- RAISE WARNING 'quad_split_geometry: maxarea=%, depth=%',maxarea,maxdepth;
IF (ST_GeometryType(geometry) not in ('ST_Polygon','ST_MultiPolygon') OR NOT ST_IsValid(geometry)) THEN
IF not ST_IsValid(geometry) THEN
RETURN;
END IF;
IF ST_Dimension(geometry) != 2 OR maxdepth <= 1 THEN
RETURN NEXT geometry;
RETURN;
END IF;
remainingdepth := maxdepth - 1;
area := ST_AREA(geometry);
IF remainingdepth < 1 OR area < maxarea THEN
IF area < maxarea THEN
RETURN NEXT geometry;
RETURN;
END IF;
@@ -426,7 +443,6 @@ BEGIN
xmid := (xmin+xmax)/2;
ymid := (ymin+ymax)/2;
added := 0;
FOR seg IN 1..4 LOOP
IF seg = 1 THEN
@@ -442,23 +458,20 @@ BEGIN
secbox := ST_SetSRID(ST_MakeBox2D(ST_Point(xmid,ymid),ST_Point(xmax,ymax)),4326);
END IF;
IF st_intersects(geometry, secbox) THEN
secgeo := st_intersection(geometry, secbox);
IF NOT ST_IsEmpty(secgeo) AND ST_GeometryType(secgeo) in ('ST_Polygon','ST_MultiPolygon') THEN
FOR geo IN select quad_split_geometry(secgeo, maxarea, remainingdepth) as geom LOOP
IF NOT ST_IsEmpty(geo.geom) AND ST_GeometryType(geo.geom) in ('ST_Polygon','ST_MultiPolygon') THEN
added := added + 1;
RETURN NEXT geo.geom;
END IF;
END LOOP;
END IF;
secgeo := st_intersection(geometry, secbox);
IF NOT ST_IsEmpty(secgeo) AND ST_Dimension(secgeo) = 2 THEN
FOR geo IN SELECT quad_split_geometry(secgeo, maxarea, remainingdepth) as geom LOOP
IF NOT ST_IsEmpty(geo.geom) AND ST_Dimension(geo.geom) = 2 THEN
RETURN NEXT geo.geom;
END IF;
END LOOP;
END IF;
END LOOP;
RETURN;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION split_geometry(geometry GEOMETRY)
@@ -467,14 +480,26 @@ CREATE OR REPLACE FUNCTION split_geometry(geometry GEOMETRY)
DECLARE
geo RECORD;
BEGIN
-- 10000000000 is ~~ 1x1 degree
FOR geo IN select quad_split_geometry(geometry, 0.25, 20) as geom LOOP
RETURN NEXT geo.geom;
END LOOP;
IF ST_GeometryType(geometry) = 'ST_MultiPolygon'
and ST_Area(geometry) * 10 > ST_Area(Box2D(geometry))
THEN
FOR geo IN
SELECT quad_split_geometry(g, 0.25, 20) as geom
FROM (SELECT (ST_Dump(geometry)).geom::geometry(Polygon, 4326) AS g) xx
LOOP
RETURN NEXT geo.geom;
END LOOP;
ELSE
FOR geo IN
SELECT quad_split_geometry(geometry, 0.25, 20) as geom
LOOP
RETURN NEXT geo.geom;
END LOOP;
END IF;
RETURN;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
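-- A minimal sketch of splitting a large geometry with the function above;
-- the buffered point stands in for a real oversized polygon.
SELECT count(*) AS pieces
  FROM split_geometry(ST_Buffer(ST_SetSRID(ST_MakePoint(0, 0), 4326), 2.0)) AS geom;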
CREATE OR REPLACE FUNCTION simplify_large_polygons(geometry GEOMETRY)
RETURNS GEOMETRY
@@ -488,7 +513,7 @@ BEGIN
RETURN geometry;
END;
$$
LANGUAGE plpgsql IMMUTABLE;
LANGUAGE plpgsql IMMUTABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION place_force_delete(placeid BIGINT)
@@ -614,3 +639,34 @@ BEGIN
RETURN NULL;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE FUNCTION place_update_entrances(placeid BIGINT, osmid BIGINT)
RETURNS INTEGER
AS $$
DECLARE
entrance RECORD;
osm_ids BIGINT[];
BEGIN
osm_ids := '{}';
FOR entrance in SELECT osm_id, type, geometry, extratags
FROM place_entrance
WHERE osm_id IN (SELECT unnest(nodes) FROM planet_osm_ways WHERE id=osmid)
LOOP
osm_ids := array_append(osm_ids, entrance.osm_id);
INSERT INTO placex_entrance (place_id, osm_id, type, location, extratags)
VALUES (placeid, entrance.osm_id, entrance.type, entrance.geometry, entrance.extratags)
ON CONFLICT (place_id, osm_id) DO UPDATE
SET type = excluded.type, location = excluded.location, extratags = excluded.extratags;
END LOOP;
IF array_length(osm_ids, 1) > 0 THEN
DELETE FROM placex_entrance WHERE place_id=placeid AND NOT osm_id=ANY(osm_ids);
ELSE
DELETE FROM placex_entrance WHERE place_id=placeid;
END IF;
RETURN NULL;
END;
$$
LANGUAGE plpgsql;
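-- A minimal sketch of invoking the new entrance sync for a single way;
-- the place_id/osm_id pair is a hypothetical example.
SELECT place_update_entrances(12345, 67890);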

View File

@@ -2,7 +2,7 @@
--
-- This file is part of Nominatim. (https://nominatim.org)
--
-- Copyright (C) 2022 by the Nominatim developer community.
-- Copyright (C) 2025 by the Nominatim developer community.
-- For a full list of authors see the git log.
-- Indices used only during search and update.
@@ -21,30 +21,25 @@ CREATE INDEX IF NOT EXISTS idx_placex_parent_place_id
ON placex USING BTREE (parent_place_id) {{db.tablespace.search_index}}
WHERE parent_place_id IS NOT NULL;
---
-- Used to find postcode areas after a search in location_postcode.
CREATE INDEX IF NOT EXISTS idx_placex_postcode_areas
ON placex USING BTREE (country_code, postcode) {{db.tablespace.search_index}}
WHERE osm_type = 'R' AND class = 'boundary' AND type = 'postal_code';
---
CREATE INDEX IF NOT EXISTS idx_placex_geometry ON placex
USING GIST (geometry) {{db.tablespace.search_index}};
---
-- Index is needed during import but can be dropped as soon as a full
-- geometry index is in place. The partial index is almost as big as the full
-- index.
---
DROP INDEX IF EXISTS idx_placex_geometry_lower_rank_ways;
---
CREATE INDEX IF NOT EXISTS idx_placex_geometry_reverse_lookupPolygon
ON placex USING gist (geometry) {{db.tablespace.search_index}}
WHERE St_GeometryType(geometry) in ('ST_Polygon', 'ST_MultiPolygon')
AND rank_address between 4 and 25 AND type != 'postcode'
AND rank_address between 4 and 25
AND name is not null AND indexed_status = 0 AND linked_place_id is null;
---
-- used in reverse large area lookup
CREATE INDEX IF NOT EXISTS idx_placex_geometry_reverse_lookupPlaceNode
ON placex USING gist (ST_Buffer(geometry, reverse_place_diameter(rank_search)))
{{db.tablespace.search_index}}
WHERE rank_address between 4 and 25 AND type != 'postcode'
WHERE rank_address between 4 and 25
AND name is not null AND linked_place_id is null AND osm_type = 'N';
---
CREATE INDEX IF NOT EXISTS idx_osmline_parent_place_id
@@ -53,9 +48,6 @@ CREATE INDEX IF NOT EXISTS idx_osmline_parent_place_id
---
CREATE INDEX IF NOT EXISTS idx_osmline_parent_osm_id
ON location_property_osmline USING BTREE (osm_id) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_postcode_postcode
ON location_postcode USING BTREE (postcode) {{db.tablespace.search_index}};
{% if drop %}
---
@@ -82,8 +74,8 @@ CREATE INDEX IF NOT EXISTS idx_postcode_postcode
deferred BOOLEAN
);
---
CREATE INDEX IF NOT EXISTS idx_location_postcode_parent_place_id
ON location_postcode USING BTREE (parent_place_id) {{db.tablespace.address_index}};
CREATE INDEX IF NOT EXISTS idx_location_postcodes_parent_place_id
ON location_postcodes USING BTREE (parent_place_id) {{db.tablespace.address_index}};
{% endif %}
-- Indices only needed for search.

View File

@@ -2,36 +2,48 @@
--
-- This file is part of Nominatim. (https://nominatim.org)
--
-- Copyright (C) 2022 by the Nominatim developer community.
-- Copyright (C) 2026 by the Nominatim developer community.
-- For a full list of authors see the git log.
drop table IF EXISTS search_name_blank CASCADE;
CREATE TABLE search_name_blank (
place_id BIGINT,
address_rank smallint,
name_vector integer[],
centroid GEOMETRY(Geometry, 4326)
place_id BIGINT NOT NULL,
address_rank smallint NOT NULL,
name_vector integer[] NOT NULL,
centroid GEOMETRY(Geometry, 4326) NOT NULL
);
{% for partition in db.partitions %}
CREATE TABLE location_area_large_{{ partition }} () INHERITS (location_area_large) {{db.tablespace.address_data}};
CREATE INDEX idx_location_area_large_{{ partition }}_place_id ON location_area_large_{{ partition }} USING BTREE (place_id) {{db.tablespace.address_index}};
CREATE INDEX idx_location_area_large_{{ partition }}_geometry ON location_area_large_{{ partition }} USING GIST (geometry) {{db.tablespace.address_index}};
CREATE INDEX idx_location_area_large_{{ partition }}_place_id
ON location_area_large_{{ partition }}
USING BTREE (place_id) {{db.tablespace.address_index}};
CREATE INDEX idx_location_area_large_{{ partition }}_geometry
ON location_area_large_{{ partition }}
USING GIST (geometry) {{db.tablespace.address_index}};
CREATE TABLE search_name_{{ partition }} () INHERITS (search_name_blank) {{db.tablespace.address_data}};
CREATE INDEX idx_search_name_{{ partition }}_place_id ON search_name_{{ partition }} USING BTREE (place_id) {{db.tablespace.address_index}};
CREATE INDEX idx_search_name_{{ partition }}_centroid_street ON search_name_{{ partition }} USING GIST (centroid) {{db.tablespace.address_index}} where address_rank between 26 and 27;
CREATE INDEX idx_search_name_{{ partition }}_centroid_place ON search_name_{{ partition }} USING GIST (centroid) {{db.tablespace.address_index}} where address_rank between 2 and 25;
CREATE UNIQUE INDEX idx_search_name_{{ partition }}_place_id
ON search_name_{{ partition }}
USING BTREE (place_id) {{db.tablespace.address_index}};
CREATE INDEX idx_search_name_{{ partition }}_centroid_street
ON search_name_{{ partition }} USING GIST (centroid) {{db.tablespace.address_index}}
WHERE address_rank between 26 and 27;
CREATE INDEX idx_search_name_{{ partition }}_centroid_place
ON search_name_{{ partition }} USING GIST (centroid) {{db.tablespace.address_index}}
WHERE address_rank between 2 and 25;
DROP TABLE IF EXISTS location_road_{{ partition }};
CREATE TABLE location_road_{{ partition }} (
place_id BIGINT,
partition SMALLINT,
place_id BIGINT NOT NULL,
partition SMALLINT NOT NULL,
country_code VARCHAR(2),
geometry GEOMETRY(Geometry, 4326)
geometry GEOMETRY(Geometry, 4326) NOT NULL
) {{db.tablespace.address_data}};
CREATE INDEX idx_location_road_{{ partition }}_geometry ON location_road_{{ partition }} USING GIST (geometry) {{db.tablespace.address_index}};
CREATE INDEX idx_location_road_{{ partition }}_place_id ON location_road_{{ partition }} USING BTREE (place_id) {{db.tablespace.address_index}};
CREATE INDEX idx_location_road_{{ partition }}_geometry
ON location_road_{{ partition }}
USING GIST (geometry) {{db.tablespace.address_index}};
CREATE UNIQUE INDEX idx_location_road_{{ partition }}_place_id
ON location_road_{{ partition }}
USING BTREE (place_id) {{db.tablespace.address_index}};
{% endfor %}

View File

@@ -2,7 +2,7 @@
--
-- This file is part of Nominatim. (https://nominatim.org)
--
-- Copyright (C) 2022 by the Nominatim developer community.
-- Copyright (C) 2025 by the Nominatim developer community.
-- For a full list of authors see the git log.
-- insert creates the location tables, creates location indexes if indexed == true
@@ -25,5 +25,9 @@ CREATE TRIGGER place_before_delete BEFORE DELETE ON place
CREATE TRIGGER place_before_insert BEFORE INSERT ON place
FOR EACH ROW EXECUTE PROCEDURE place_insert();
CREATE TRIGGER location_postcode_before_update BEFORE UPDATE ON location_postcode
FOR EACH ROW EXECUTE PROCEDURE postcode_update();
CREATE TRIGGER location_postcode_before_update BEFORE UPDATE ON location_postcodes
FOR EACH ROW EXECUTE PROCEDURE postcodes_update();
CREATE TRIGGER location_postcodes_before_delete BEFORE DELETE ON location_postcodes
FOR EACH ROW EXECUTE PROCEDURE postcodes_delete();
CREATE TRIGGER location_postcodes_before_insert BEFORE INSERT ON location_postcodes
FOR EACH ROW EXECUTE PROCEDURE postcodes_insert();

View File

@@ -2,7 +2,7 @@
--
-- This file is part of Nominatim. (https://nominatim.org)
--
-- Copyright (C) 2022 by the Nominatim developer community.
-- Copyright (C) 2026 by the Nominatim developer community.
-- For a full list of authors see the git log.
drop table if exists import_status;
@@ -23,24 +23,6 @@ CREATE TABLE import_osmosis_log (
event text
);
CREATE TABLE new_query_log (
type text,
starttime timestamp,
ipaddress text,
useragent text,
language text,
query text,
searchterm text,
endtime timestamp,
results integer,
format text,
secret text
);
CREATE INDEX idx_new_query_log_starttime ON new_query_log USING BTREE (starttime);
GRANT INSERT ON new_query_log TO "{{config.DATABASE_WEBUSER}}" ;
GRANT UPDATE ON new_query_log TO "{{config.DATABASE_WEBUSER}}" ;
GRANT SELECT ON new_query_log TO "{{config.DATABASE_WEBUSER}}" ;
GRANT SELECT ON TABLE country_name TO "{{config.DATABASE_WEBUSER}}";
DROP TABLE IF EXISTS nominatim_properties;
@@ -52,53 +34,53 @@ GRANT SELECT ON TABLE nominatim_properties TO "{{config.DATABASE_WEBUSER}}";
drop table IF EXISTS location_area CASCADE;
CREATE TABLE location_area (
place_id BIGINT,
keywords INTEGER[],
partition SMALLINT,
place_id BIGINT NOT NULL,
keywords INTEGER[] NOT NULL,
partition SMALLINT NOT NULL,
rank_search SMALLINT NOT NULL,
rank_address SMALLINT NOT NULL,
country_code VARCHAR(2),
isguess BOOL,
isguess BOOL NOT NULL,
postcode TEXT,
centroid GEOMETRY(Point, 4326),
geometry GEOMETRY(Geometry, 4326)
centroid GEOMETRY(Point, 4326) NOT NULL,
geometry GEOMETRY(Geometry, 4326) NOT NULL
);
CREATE TABLE location_area_large () INHERITS (location_area);
DROP TABLE IF EXISTS location_area_country;
CREATE TABLE location_area_country (
place_id BIGINT,
country_code varchar(2),
geometry GEOMETRY(Geometry, 4326)
place_id BIGINT NOT NULL,
country_code varchar(2) NOT NULL,
geometry GEOMETRY(Geometry, 4326) NOT NULL
) {{db.tablespace.address_data}};
CREATE INDEX idx_location_area_country_geometry ON location_area_country USING GIST (geometry) {{db.tablespace.address_index}};
CREATE TABLE location_property_tiger (
place_id BIGINT,
place_id BIGINT NOT NULL,
parent_place_id BIGINT,
startnumber INTEGER,
endnumber INTEGER,
step SMALLINT,
partition SMALLINT,
linegeo GEOMETRY,
startnumber INTEGER NOT NULL,
endnumber INTEGER NOT NULL,
step SMALLINT NOT NULL,
partition SMALLINT NOT NULL,
linegeo GEOMETRY NOT NULL,
postcode TEXT);
GRANT SELECT ON location_property_tiger TO "{{config.DATABASE_WEBUSER}}";
drop table if exists location_property_osmline;
CREATE TABLE location_property_osmline (
place_id BIGINT NOT NULL,
osm_id BIGINT,
osm_id BIGINT NOT NULL,
parent_place_id BIGINT,
geometry_sector INTEGER,
geometry_sector INTEGER NOT NULL,
indexed_date TIMESTAMP,
startnumber INTEGER,
endnumber INTEGER,
step SMALLINT,
partition SMALLINT,
indexed_status SMALLINT,
linegeo GEOMETRY,
partition SMALLINT NOT NULL,
indexed_status SMALLINT NOT NULL,
linegeo GEOMETRY NOT NULL,
address HSTORE,
token_info JSONB, -- custom column for tokenizer use only
postcode TEXT,
@@ -113,27 +95,28 @@ GRANT SELECT ON location_property_osmline TO "{{config.DATABASE_WEBUSER}}";
drop table IF EXISTS search_name;
{% if not db.reverse_only %}
CREATE TABLE search_name (
place_id BIGINT,
importance FLOAT,
search_rank SMALLINT,
address_rank SMALLINT,
name_vector integer[],
nameaddress_vector integer[],
place_id BIGINT NOT NULL,
importance FLOAT NOT NULL,
search_rank SMALLINT NOT NULL,
address_rank SMALLINT NOT NULL,
name_vector integer[] NOT NULL,
nameaddress_vector integer[] NOT NULL,
country_code varchar(2),
centroid GEOMETRY(Geometry, 4326)
centroid GEOMETRY(Geometry, 4326) NOT NULL
) {{db.tablespace.search_data}};
CREATE INDEX idx_search_name_place_id ON search_name USING BTREE (place_id) {{db.tablespace.search_index}};
CREATE UNIQUE INDEX idx_search_name_place_id
ON search_name USING BTREE (place_id) {{db.tablespace.search_index}};
GRANT SELECT ON search_name to "{{config.DATABASE_WEBUSER}}" ;
{% endif %}
drop table IF EXISTS place_addressline;
CREATE TABLE place_addressline (
place_id BIGINT,
address_place_id BIGINT,
distance FLOAT,
cached_rank_address SMALLINT,
fromarea boolean,
isaddress boolean
place_id BIGINT NOT NULL,
address_place_id BIGINT NOT NULL,
distance FLOAT NOT NULL,
cached_rank_address SMALLINT NOT NULL,
fromarea boolean NOT NULL,
isaddress boolean NOT NULL
) {{db.tablespace.search_data}};
CREATE INDEX idx_place_addressline_place_id on place_addressline USING BTREE (place_id) {{db.tablespace.search_index}};
@@ -146,18 +129,18 @@ CREATE TABLE placex (
linked_place_id BIGINT,
importance FLOAT,
indexed_date TIMESTAMP,
geometry_sector INTEGER,
rank_address SMALLINT,
rank_search SMALLINT,
partition SMALLINT,
indexed_status SMALLINT,
geometry_sector INTEGER NOT NULL,
rank_address SMALLINT NOT NULL,
rank_search SMALLINT NOT NULL,
partition SMALLINT NOT NULL,
indexed_status SMALLINT NOT NULL,
LIKE place INCLUDING CONSTRAINTS,
wikipedia TEXT, -- calculated wikipedia article name (language:title)
token_info JSONB, -- custom column for tokenizer use only
country_code varchar(2),
housenumber TEXT,
postcode TEXT,
centroid GEOMETRY(Geometry, 4326)
centroid GEOMETRY(Geometry, 4326) NOT NULL
) {{db.tablespace.search_data}};
CREATE UNIQUE INDEX idx_place_id ON placex USING BTREE (place_id) {{db.tablespace.search_index}};
@@ -192,8 +175,7 @@ CREATE INDEX idx_placex_geometry_buildings ON placex
-- - linking of place nodes with same type to boundaries
CREATE INDEX idx_placex_geometry_placenode ON placex
USING SPGIST (geometry) {{db.tablespace.address_index}}
WHERE osm_type = 'N' and rank_search < 26
and class = 'place' and type != 'postcode';
WHERE osm_type = 'N' and rank_search < 26 and class = 'place';
-- Usage: - is node part of a way?
-- - find parent of interpolation spatially
@@ -228,21 +210,48 @@ GRANT SELECT ON planet_osm_rels to "{{config.DATABASE_WEBUSER}}" ;
GRANT SELECT on location_area to "{{config.DATABASE_WEBUSER}}" ;
-- Table for synthetic postcodes.
DROP TABLE IF EXISTS location_postcode;
CREATE TABLE location_postcode (
place_id BIGINT,
DROP TABLE IF EXISTS location_postcodes;
CREATE TABLE location_postcodes (
place_id BIGINT NOT NULL,
parent_place_id BIGINT,
rank_search SMALLINT,
rank_address SMALLINT,
indexed_status SMALLINT,
osm_id BIGINT,
rank_search SMALLINT NOT NULL,
indexed_status SMALLINT NOT NULL,
indexed_date TIMESTAMP,
country_code varchar(2),
postcode TEXT,
geometry GEOMETRY(Geometry, 4326)
country_code varchar(2) NOT NULL,
postcode TEXT NOT NULL,
centroid GEOMETRY(Geometry, 4326) NOT NULL,
geometry GEOMETRY(Geometry, 4326) NOT NULL
);
CREATE UNIQUE INDEX idx_postcode_id ON location_postcode USING BTREE (place_id) {{db.tablespace.search_index}};
CREATE INDEX idx_postcode_geometry ON location_postcode USING GIST (geometry) {{db.tablespace.address_index}};
GRANT SELECT ON location_postcode TO "{{config.DATABASE_WEBUSER}}" ;
CREATE UNIQUE INDEX idx_location_postcodes_id ON location_postcodes
USING BTREE (place_id) {{db.tablespace.search_index}};
CREATE INDEX idx_location_postcodes_geometry ON location_postcodes
USING GIST (geometry) {{db.tablespace.search_index}};
CREATE INDEX IF NOT EXISTS idx_location_postcodes_postcode
ON location_postcodes USING BTREE (postcode, country_code)
{{db.tablespace.search_index}};
CREATE INDEX IF NOT EXISTS idx_location_postcodes_osmid
ON location_postcodes USING BTREE (osm_id) {{db.tablespace.search_index}};
GRANT SELECT ON location_postcodes TO "{{config.DATABASE_WEBUSER}}" ;
-- Table to store location of entrance nodes
DROP TABLE IF EXISTS placex_entrance;
CREATE TABLE placex_entrance (
place_id BIGINT NOT NULL,
osm_id BIGINT NOT NULL,
type TEXT NOT NULL,
location GEOMETRY(Point, 4326) NOT NULL,
extratags HSTORE
);
CREATE UNIQUE INDEX idx_placex_entrance_place_id_osm_id ON placex_entrance
USING BTREE (place_id, osm_id) {{db.tablespace.search_index}};
GRANT SELECT ON placex_entrance TO "{{config.DATABASE_WEBUSER}}" ;
-- Create an index on the place table for lookups to populate the entrance
-- table
CREATE INDEX IF NOT EXISTS idx_placex_entrance_lookup ON place
USING BTREE (osm_id)
WHERE class IN ('routing:entrance', 'entrance');
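-- A minimal lookup sketch against the new entrance table; the place_id is a
-- hypothetical example value.
SELECT osm_id, type, ST_AsText(location) FROM placex_entrance WHERE place_id = 12345;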
DROP TABLE IF EXISTS import_polygon_error;
CREATE TABLE import_polygon_error (

View File

@@ -15,6 +15,99 @@ CREATE TABLE location_property_tiger_import (
step SMALLINT,
postcode TEXT);
-- Lookup functions for the tiger import when update
-- information has been dropped (see gh-issue #2463)
CREATE OR REPLACE FUNCTION getNearestNamedRoadPlaceIdSlow(in_centroid GEOMETRY,
in_token_info JSONB)
RETURNS BIGINT
AS $$
DECLARE
out_place_id BIGINT;
BEGIN
SELECT place_id INTO out_place_id
FROM search_name
WHERE
-- finds rows where name_vector shares elements with search tokens.
token_matches_street(in_token_info, name_vector)
-- limits search area
AND centroid && ST_Expand(in_centroid, 0.015)
AND address_rank BETWEEN 26 AND 27
ORDER BY ST_Distance(centroid, in_centroid) ASC
LIMIT 1;
RETURN out_place_id;
END
$$
LANGUAGE plpgsql;
CREATE OR REPLACE FUNCTION getNearestParallelRoadFeatureSlow(line GEOMETRY)
RETURNS BIGINT
AS $$
DECLARE
r RECORD;
search_diameter FLOAT;
p1 GEOMETRY;
p2 GEOMETRY;
p3 GEOMETRY;
BEGIN
IF ST_GeometryType(line) not in ('ST_LineString') THEN
RETURN NULL;
END IF;
p1 := ST_LineInterpolatePoint(line,0);
p2 := ST_LineInterpolatePoint(line,0.5);
p3 := ST_LineInterpolatePoint(line,1);
search_diameter := 0.0005;
WHILE search_diameter < 0.01 LOOP
FOR r IN
SELECT place_id FROM placex
WHERE ST_DWithin(line, geometry, search_diameter)
AND rank_address BETWEEN 26 AND 27
ORDER BY (ST_distance(geometry, p1)+
ST_distance(geometry, p2)+
ST_distance(geometry, p3)) ASC limit 1
LOOP
RETURN r.place_id;
END LOOP;
search_diameter := search_diameter * 2;
END LOOP;
RETURN NULL;
END
$$
LANGUAGE plpgsql;
CREATE OR REPLACE FUNCTION getNearestRoadPlaceIdSlow(point GEOMETRY)
RETURNS BIGINT
AS $$
DECLARE
r RECORD;
search_diameter FLOAT;
BEGIN
search_diameter := 0.00005;
WHILE search_diameter < 0.1 LOOP
FOR r IN
SELECT place_id FROM placex
WHERE ST_DWithin(geometry, point, search_diameter)
AND rank_address BETWEEN 26 AND 27
ORDER BY ST_Distance(geometry, point) ASC limit 1
LOOP
RETURN r.place_id;
END LOOP;
search_diameter := search_diameter * 2;
END LOOP;
RETURN NULL;
END
$$
LANGUAGE plpgsql;
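-- A minimal sketch of the slow point-based fallback above; the coordinates
-- are a hypothetical example.
SELECT getNearestRoadPlaceIdSlow(ST_SetSRID(ST_MakePoint(-77.03, 38.90), 4326));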
-- Tiger import function
CREATE OR REPLACE FUNCTION tiger_line_import(linegeo GEOMETRY, in_startnumber INTEGER,
in_endnumber INTEGER, interpolationtype TEXT,
token_info JSONB, in_postcode TEXT) RETURNS INTEGER
@@ -71,28 +164,51 @@ BEGIN
place_centroid := ST_Centroid(linegeo);
out_partition := get_partition('us');
out_parent_place_id := getNearestNamedRoadPlaceId(out_partition, place_centroid,
-- HYBRID LOOKUP LOGIC (see gh-issue #2463)
-- if partition tables exist, use them for fast spatial lookups
{% if 'location_road_0' in db.tables %}
out_parent_place_id := getNearestNamedRoadPlaceId(out_partition, place_centroid,
token_info);
IF out_parent_place_id IS NULL THEN
SELECT getNearestParallelRoadFeature(out_partition, linegeo)
INTO out_parent_place_id;
IF out_parent_place_id IS NULL THEN
SELECT getNearestParallelRoadFeature(out_partition, linegeo)
INTO out_parent_place_id;
END IF;
IF out_parent_place_id IS NULL THEN
SELECT getNearestRoadPlaceId(out_partition, place_centroid)
INTO out_parent_place_id;
END IF;
-- When updatable information has been dropped:
-- Partition tables no longer exist, but search_name still persists.
{% elif 'search_name' in db.tables %}
-- Fallback: look up in the 'search_name' table,
-- though spatial lookups there can be slower.
out_parent_place_id := getNearestNamedRoadPlaceIdSlow(place_centroid, token_info);
IF out_parent_place_id IS NULL THEN
out_parent_place_id := getNearestParallelRoadFeatureSlow(linegeo);
END IF;
IF out_parent_place_id IS NULL THEN
out_parent_place_id := getNearestRoadPlaceIdSlow(place_centroid);
END IF;
{% endif %}
-- If a parent was found, insert the street (line) into the import table
IF out_parent_place_id IS NOT NULL THEN
INSERT INTO location_property_tiger_import (linegeo, place_id, partition,
parent_place_id, startnumber, endnumber,
step, postcode)
VALUES (linegeo, nextval('seq_place'), out_partition,
out_parent_place_id, startnumber, endnumber,
stepsize, in_postcode);
RETURN 1;
END IF;
RETURN 0;
IF out_parent_place_id IS NULL THEN
SELECT getNearestRoadPlaceId(out_partition, place_centroid)
INTO out_parent_place_id;
END IF;
--insert street(line) into import table
insert into location_property_tiger_import (linegeo, place_id, partition,
parent_place_id, startnumber, endnumber,
step, postcode)
values (linegeo, nextval('seq_place'), out_partition,
out_parent_place_id, startnumber, endnumber,
stepsize, in_postcode);
RETURN 1;
END;
$$
LANGUAGE plpgsql;
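-- A minimal sketch of a tiger_line_import call; the geometry, house-number
-- range, token info and postcode are hypothetical placeholder values.
SELECT tiger_line_import(
    ST_SetSRID(ST_MakeLine(ST_MakePoint(-77.05, 38.90), ST_MakePoint(-77.04, 38.90)), 4326),
    100, 198, 'even',
    '{"street": "{1234}"}'::jsonb,
    '20001');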

View File

@@ -12,7 +12,7 @@ CREATE OR REPLACE FUNCTION token_get_name_search_tokens(info JSONB)
RETURNS INTEGER[]
AS $$
SELECT (info->>'names')::INTEGER[]
$$ LANGUAGE SQL IMMUTABLE STRICT;
$$ LANGUAGE SQL IMMUTABLE STRICT PARALLEL SAFE;
-- Get tokens for matching the place name against others.
@@ -22,7 +22,7 @@ CREATE OR REPLACE FUNCTION token_get_name_match_tokens(info JSONB)
RETURNS INTEGER[]
AS $$
SELECT (info->>'names')::INTEGER[]
$$ LANGUAGE SQL IMMUTABLE STRICT;
$$ LANGUAGE SQL IMMUTABLE STRICT PARALLEL SAFE;
-- Return the housenumber tokens applicable for the place.
@@ -30,7 +30,7 @@ CREATE OR REPLACE FUNCTION token_get_housenumber_search_tokens(info JSONB)
RETURNS INTEGER[]
AS $$
SELECT (info->>'hnr_tokens')::INTEGER[]
$$ LANGUAGE SQL IMMUTABLE STRICT;
$$ LANGUAGE SQL IMMUTABLE STRICT PARALLEL SAFE;
-- Return the housenumber in the form that it can be matched during search.
@@ -38,77 +38,77 @@ CREATE OR REPLACE FUNCTION token_normalized_housenumber(info JSONB)
RETURNS TEXT
AS $$
SELECT info->>'hnr';
$$ LANGUAGE SQL IMMUTABLE STRICT;
$$ LANGUAGE SQL IMMUTABLE STRICT PARALLEL SAFE;
CREATE OR REPLACE FUNCTION token_is_street_address(info JSONB)
RETURNS BOOLEAN
AS $$
SELECT info->>'street' is not null or info->>'place' is null;
$$ LANGUAGE SQL IMMUTABLE;
$$ LANGUAGE SQL IMMUTABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION token_has_addr_street(info JSONB)
RETURNS BOOLEAN
AS $$
SELECT info->>'street' is not null and info->>'street' != '{}';
$$ LANGUAGE SQL IMMUTABLE;
$$ LANGUAGE SQL IMMUTABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION token_has_addr_place(info JSONB)
RETURNS BOOLEAN
AS $$
SELECT info->>'place' is not null;
$$ LANGUAGE SQL IMMUTABLE;
$$ LANGUAGE SQL IMMUTABLE PARALLEL SAFE;
CREATE OR REPLACE FUNCTION token_matches_street(info JSONB, street_tokens INTEGER[])
RETURNS BOOLEAN
AS $$
SELECT (info->>'street')::INTEGER[] && street_tokens
$$ LANGUAGE SQL IMMUTABLE STRICT;
$$ LANGUAGE SQL IMMUTABLE STRICT PARALLEL SAFE;
CREATE OR REPLACE FUNCTION token_matches_place(info JSONB, place_tokens INTEGER[])
RETURNS BOOLEAN
AS $$
SELECT (info->>'place')::INTEGER[] <@ place_tokens
$$ LANGUAGE SQL IMMUTABLE STRICT;
$$ LANGUAGE SQL IMMUTABLE STRICT PARALLEL SAFE;
CREATE OR REPLACE FUNCTION token_addr_place_search_tokens(info JSONB)
RETURNS INTEGER[]
AS $$
SELECT (info->>'place')::INTEGER[]
$$ LANGUAGE SQL IMMUTABLE STRICT;
$$ LANGUAGE SQL IMMUTABLE STRICT PARALLEL SAFE;
CREATE OR REPLACE FUNCTION token_get_address_keys(info JSONB)
RETURNS SETOF TEXT
AS $$
SELECT * FROM jsonb_object_keys(info->'addr');
$$ LANGUAGE SQL IMMUTABLE STRICT;
$$ LANGUAGE SQL IMMUTABLE STRICT PARALLEL SAFE;
CREATE OR REPLACE FUNCTION token_get_address_search_tokens(info JSONB, key TEXT)
RETURNS INTEGER[]
AS $$
SELECT (info->'addr'->>key)::INTEGER[];
$$ LANGUAGE SQL IMMUTABLE STRICT;
$$ LANGUAGE SQL IMMUTABLE STRICT PARALLEL SAFE;
CREATE OR REPLACE FUNCTION token_matches_address(info JSONB, key TEXT, tokens INTEGER[])
RETURNS BOOLEAN
AS $$
SELECT (info->'addr'->>key)::INTEGER[] <@ tokens;
$$ LANGUAGE SQL IMMUTABLE STRICT;
$$ LANGUAGE SQL IMMUTABLE STRICT PARALLEL SAFE;
CREATE OR REPLACE FUNCTION token_get_postcode(info JSONB)
RETURNS TEXT
AS $$
SELECT info->>'postcode';
$$ LANGUAGE SQL IMMUTABLE STRICT;
$$ LANGUAGE SQL IMMUTABLE STRICT PARALLEL SAFE;
-- Return token info that should be saved permanently in the database.
@@ -116,7 +116,7 @@ CREATE OR REPLACE FUNCTION token_strip_info(info JSONB)
RETURNS JSONB
AS $$
SELECT NULL::JSONB;
$$ LANGUAGE SQL IMMUTABLE STRICT;
$$ LANGUAGE SQL IMMUTABLE STRICT PARALLEL SAFE;
--------------- private functions ----------------------------------------------
@@ -128,16 +128,14 @@ DECLARE
partial_terms TEXT[] = '{}'::TEXT[];
term TEXT;
term_id INTEGER;
term_count INTEGER;
BEGIN
SELECT min(word_id) INTO full_token
FROM word WHERE word = norm_term and type = 'W';
IF full_token IS NULL THEN
full_token := nextval('seq_word');
INSERT INTO word (word_id, word_token, type, word, info)
SELECT full_token, lookup_term, 'W', norm_term,
json_build_object('count', 0)
INSERT INTO word (word_id, word_token, type, word)
SELECT full_token, lookup_term, 'W', norm_term
FROM unnest(lookup_terms) as lookup_term;
END IF;
@@ -150,14 +148,67 @@ BEGIN
partial_tokens := '{}'::INT[];
FOR term IN SELECT unnest(partial_terms) LOOP
SELECT min(word_id), max(info->>'count') INTO term_id, term_count
SELECT min(word_id) INTO term_id
FROM word WHERE word_token = term and type = 'w';
IF term_id IS NULL THEN
term_id := nextval('seq_word');
term_count := 0;
INSERT INTO word (word_id, word_token, type, info)
VALUES (term_id, term, 'w', json_build_object('count', term_count));
INSERT INTO word (word_id, word_token, type)
VALUES (term_id, term, 'w');
END IF;
partial_tokens := array_merge(partial_tokens, ARRAY[term_id]);
END LOOP;
END;
$$
LANGUAGE plpgsql;
CREATE OR REPLACE FUNCTION getorcreate_full_word(norm_term TEXT,
lookup_terms TEXT[],
lookup_norm_terms TEXT[],
OUT full_token INT,
OUT partial_tokens INT[])
AS $$
DECLARE
partial_terms TEXT[] = '{}'::TEXT[];
term TEXT;
term_id INTEGER;
BEGIN
SELECT min(word_id) INTO full_token
FROM word WHERE word = norm_term and type = 'W';
IF full_token IS NULL THEN
full_token := nextval('seq_word');
IF lookup_norm_terms IS NULL THEN
INSERT INTO word (word_id, word_token, type, word)
SELECT full_token, lookup_term, 'W', norm_term
FROM unnest(lookup_terms) as lookup_term;
ELSE
INSERT INTO word (word_id, word_token, type, word, info)
SELECT full_token, t.lookup, 'W', norm_term,
CASE WHEN norm_term = t.norm THEN null
ELSE json_build_object('lookup', t.norm) END
FROM unnest(lookup_terms, lookup_norm_terms) as t(lookup, norm);
END IF;
END IF;
FOR term IN SELECT unnest(string_to_array(unnest(lookup_terms), ' ')) LOOP
term := trim(term);
IF NOT (ARRAY[term] <@ partial_terms) THEN
partial_terms := partial_terms || term;
END IF;
END LOOP;
partial_tokens := '{}'::INT[];
FOR term IN SELECT unnest(partial_terms) LOOP
SELECT min(word_id) INTO term_id
FROM word WHERE word_token = term and type = 'w';
IF term_id IS NULL THEN
term_id := nextval('seq_word');
INSERT INTO word (word_id, word_token, type)
VALUES (term_id, term, 'w');
END IF;
partial_tokens := array_merge(partial_tokens, ARRAY[term_id]);

View File

@@ -67,6 +67,7 @@ markdown_extensions:
- codehilite
- admonition
- pymdownx.superfences
- pymdownx.blocks.html
- pymdownx.tabbed:
alternate_style: true
- def_list

View File

@@ -3,7 +3,7 @@
#
# This file is part of Nominatim. (https://nominatim.org)
#
# Copyright (C) 2024 by the Nominatim developer community.
# Copyright (C) 2025 by the Nominatim developer community.
# For a full list of authors see the git log.
"""
Helper script for development to run nominatim from the source directory.
@@ -15,4 +15,4 @@ sys.path.insert(1, str((Path(__file__) / '..' / 'src').resolve()))
from nominatim_db import cli
exit(cli.nominatim(module_dir=None, osm2pgsql_path=None))
exit(cli.nominatim())

View File

@@ -1 +0,0 @@
../../COPYING

View File

@@ -0,0 +1,232 @@
GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright © 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for software and other kinds of works.
The licenses for most software and other practical works are designed to take away your freedom to share and change the works. By contrast, the GNU General Public License is intended to guarantee your freedom to share and change all versions of a program--to make sure it remains free software for all its users. We, the Free Software Foundation, use the GNU General Public License for most of our software; it applies also to any other work released this way by its authors. You can apply it to your programs, too.
When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for them if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you these rights or asking you to surrender the rights. Therefore, you have certain responsibilities if you distribute copies of the software, or if you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether gratis or for a fee, you must pass on to the recipients the same freedoms that you received. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights.
Developers that use the GNU GPL protect your rights with two steps: (1) assert copyright on the software, and (2) offer you this License giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains that there is no warranty for this free software. For both users' and authors' sake, the GPL requires that modified versions be marked as changed, so that their problems will not be attributed erroneously to authors of previous versions.
Some devices are designed to deny users access to install or run modified versions of the software inside them, although the manufacturer can do so. This is fundamentally incompatible with the aim of protecting users' freedom to change the software. The systematic pattern of such abuse occurs in the area of products for individuals to use, which is precisely where it is most unacceptable. Therefore, we have designed this version of the GPL to prohibit the practice for those products. If such problems arise substantially in other domains, we stand ready to extend this provision to those domains in future versions of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents. States should not allow patents to restrict development and use of software on general-purpose computers, but in those that do, we wish to avoid the special danger that patents applied to a free program could make it effectively proprietary. To prevent this, the GPL assures that patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and modification follow.
TERMS AND CONDITIONS
0. Definitions.
“This License” refers to version 3 of the GNU General Public License.
“Copyright” also means copyright-like laws that apply to other kinds of works, such as semiconductor masks.
“The Program” refers to any copyrightable work licensed under this License. Each licensee is addressed as “you”. “Licensees” and “recipients” may be individuals or organizations.
To “modify” a work means to copy from or adapt all or part of the work in a fashion requiring copyright permission, other than the making of an exact copy. The resulting work is called a “modified version” of the earlier work or a work “based on” the earlier work.
A “covered work” means either the unmodified Program or a work based on the Program.
To “propagate” a work means to do anything with it that, without permission, would make you directly or secondarily liable for infringement under applicable copyright law, except executing it on a computer or modifying a private copy. Propagation includes copying, distribution (with or without modification), making available to the public, and in some countries other activities as well.
To “convey” a work means any kind of propagation that enables other parties to make or receive copies. Mere interaction with a user through a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays “Appropriate Legal Notices” to the extent that it includes a convenient and prominently visible feature that (1) displays an appropriate copyright notice, and (2) tells the user that there is no warranty for the work (except to the extent that warranties are provided), that licensees may convey the work under this License, and how to view a copy of this License. If the interface presents a list of user commands or options, such as a menu, a prominent item in the list meets this criterion.
1. Source Code.
The “source code” for a work means the preferred form of the work for making modifications to it. “Object code” means any non-source form of a work.
A “Standard Interface” means an interface that either is an official standard defined by a recognized standards body, or, in the case of interfaces specified for a particular programming language, one that is widely used among developers working in that language.
The “System Libraries” of an executable work include anything, other than the work as a whole, that (a) is included in the normal form of packaging a Major Component, but which is not part of that Major Component, and (b) serves only to enable use of the work with that Major Component, or to implement a Standard Interface for which an implementation is available to the public in source code form. A “Major Component”, in this context, means a major essential component (kernel, window system, and so on) of the specific operating system (if any) on which the executable work runs, or a compiler used to produce the work, or an object code interpreter used to run it.
The “Corresponding Source” for a work in object code form means all the source code needed to generate, install, and (for an executable work) run the object code and to modify the work, including scripts to control those activities. However, it does not include the work's System Libraries, or general-purpose tools or generally available free programs which are used unmodified in performing those activities but which are not part of the work. For example, Corresponding Source includes interface definition files associated with source files for the work, and the source code for shared libraries and dynamically linked subprograms that the work is specifically designed to require, such as by intimate data communication or control flow between those subprograms and other parts of the work.
The Corresponding Source need not include anything that users can regenerate automatically from other parts of the Corresponding Source.
The Corresponding Source for a work in source code form is that same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of copyright on the Program, and are irrevocable provided the stated conditions are met. This License explicitly affirms your unlimited permission to run the unmodified Program. The output from running a covered work is covered by this License only if the output, given its content, constitutes a covered work. This License acknowledges your rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not convey, without conditions so long as your license otherwise remains in force. You may convey covered works to others for the sole purpose of having them make modifications exclusively for you, or provide you with facilities for running those works, provided that you comply with the terms of this License in conveying all material for which you do not control copyright. Those thus making or running the covered works for you must do so exclusively on your behalf, under your direction and control, on terms that prohibit them from making any copies of your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under the conditions stated below. Sublicensing is not allowed; section 10 makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological measure under any applicable law fulfilling obligations under article 11 of the WIPO copyright treaty adopted on 20 December 1996, or similar laws prohibiting or restricting circumvention of such measures.
When you convey a covered work, you waive any legal power to forbid circumvention of technological measures to the extent such circumvention is effected by exercising rights under this License with respect to the covered work, and you disclaim any intention to limit operation or modification of the work as a means of enforcing, against the work's users, your or third parties' legal rights to forbid circumvention of technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice; keep intact all notices stating that this License and any non-permissive terms added in accord with section 7 apply to the code; keep intact all notices of the absence of any warranty; and give all recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey, and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to produce it from the Program, in the form of source code under the terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified it, and giving a relevant date.
b) The work must carry prominent notices stating that it is released under this License and any conditions added under section 7. This requirement modifies the requirement in section 4 to “keep intact all notices”.
c) You must license the entire work, as a whole, under this License to anyone who comes into possession of a copy. This License will therefore apply, along with any applicable section 7 additional terms, to the whole of the work, and all its parts, regardless of how they are packaged. This License gives no permission to license the work in any other way, but it does not invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display Appropriate Legal Notices; however, if the Program has interactive interfaces that do not display Appropriate Legal Notices, your work need not make them do so.
A compilation of a covered work with other separate and independent works, which are not by their nature extensions of the covered work, and which are not combined with it such as to form a larger program, in or on a volume of a storage or distribution medium, is called an “aggregate” if the compilation and its resulting copyright are not used to limit the access or legal rights of the compilation's users beyond what the individual works permit. Inclusion of a covered work in an aggregate does not cause this License to apply to the other parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms of sections 4 and 5, provided that you also convey the machine-readable Corresponding Source under the terms of this License, in one of these ways:
a) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by the Corresponding Source fixed on a durable physical medium customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by a written offer, valid for at least three years and valid for as long as you offer spare parts or customer support for that product model, to give anyone who possesses the object code either (1) a copy of the Corresponding Source for all the software in the product that is covered by this License, on a durable physical medium customarily used for software interchange, for a price no more than your reasonable cost of physically performing this conveying of source, or (2) access to copy the Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the written offer to provide the Corresponding Source. This alternative is allowed only occasionally and noncommercially, and only if you received the object code with such an offer, in accord with subsection 6b.
d) Convey the object code by offering access from a designated place (gratis or for a charge), and offer equivalent access to the Corresponding Source in the same way through the same place at no further charge. You need not require recipients to copy the Corresponding Source along with the object code. If the place to copy the object code is a network server, the Corresponding Source may be on a different server (operated by you or a third party) that supports equivalent copying facilities, provided you maintain clear directions next to the object code saying where to find the Corresponding Source. Regardless of what server hosts the Corresponding Source, you remain obligated to ensure that it is available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided you inform other peers where the object code and Corresponding Source of the work are being offered to the general public at no charge under subsection 6d.
A separable portion of the object code, whose source code is excluded from the Corresponding Source as a System Library, need not be included in conveying the object code work.
A “User Product” is either (1) a “consumer product”, which means any tangible personal property which is normally used for personal, family, or household purposes, or (2) anything designed or sold for incorporation into a dwelling. In determining whether a product is a consumer product, doubtful cases shall be resolved in favor of coverage. For a particular product received by a particular user, “normally used” refers to a typical or common use of that class of product, regardless of the status of the particular user or of the way in which the particular user actually uses, or expects or is expected to use, the product. A product is a consumer product regardless of whether the product has substantial commercial, industrial or non-consumer uses, unless such uses represent the only significant mode of use of the product.
“Installation Information” for a User Product means any methods, procedures, authorization keys, or other information required to install and execute modified versions of a covered work in that User Product from a modified version of its Corresponding Source. The information must suffice to ensure that the continued functioning of the modified object code is in no case prevented or interfered with solely because modification has been made.
If you convey an object code work under this section in, or with, or specifically for use in, a User Product, and the conveying occurs as part of a transaction in which the right of possession and use of the User Product is transferred to the recipient in perpetuity or for a fixed term (regardless of how the transaction is characterized), the Corresponding Source conveyed under this section must be accompanied by the Installation Information. But this requirement does not apply if neither you nor any third party retains the ability to install modified object code on the User Product (for example, the work has been installed in ROM).
The requirement to provide Installation Information does not include a requirement to continue to provide support service, warranty, or updates for a work that has been modified or installed by the recipient, or for the User Product in which it has been modified or installed. Access to a network may be denied when the modification itself materially and adversely affects the operation of the network or violates the rules and protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided, in accord with this section must be in a format that is publicly documented (and with an implementation available to the public in source code form), and must require no special password or key for unpacking, reading or copying.
7. Additional Terms.
“Additional permissions” are terms that supplement the terms of this License by making exceptions from one or more of its conditions. Additional permissions that are applicable to the entire Program shall be treated as though they were included in this License, to the extent that they are valid under applicable law. If additional permissions apply only to part of the Program, that part may be used separately under those permissions, but the entire Program remains governed by this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option remove any additional permissions from that copy, or from any part of it. (Additional permissions may be written to require their own removal in certain cases when you modify the work.) You may place additional permissions on material, added by you to a covered work, for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you add to a covered work, you may (if authorized by the copyright holders of that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or author attributions in that material or in the Appropriate Legal Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or requiring that modified versions of such material be marked in reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or authors of the material; or
e) Declining to grant rights under trademark law for use of some trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that material by anyone who conveys the material (or modified versions of it) with contractual assumptions of liability to the recipient, for any liability that these contractual assumptions directly impose on those licensors and authors.
All other non-permissive additional terms are considered “further restrictions” within the meaning of section 10. If the Program as you received it, or any part of it, contains a notice stating that it is governed by this License along with a term that is a further restriction, you may remove that term. If a license document contains a further restriction but permits relicensing or conveying under this License, you may add to a covered work material governed by the terms of that license document, provided that the further restriction does not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you must place, in the relevant source files, a statement of the additional terms that apply to those files, or a notice indicating where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the form of a separately written license, or stated as exceptions; the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly provided under this License. Any attempt otherwise to propagate or modify it is void, and will automatically terminate your rights under this License (including any patent licenses granted under the third paragraph of section 11).
However, if you cease all violation of this License, then your license from a particular copyright holder is reinstated (a) provisionally, unless and until the copyright holder explicitly and finally terminates your license, and (b) permanently, if the copyright holder fails to notify you of the violation by some reasonable means prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is reinstated permanently if the copyright holder notifies you of the violation by some reasonable means, this is the first time you have received notice of violation of this License (for any work) from that copyright holder, and you cure the violation prior to 30 days after your receipt of the notice.
Termination of your rights under this section does not terminate the licenses of parties who have received copies or rights from you under this License. If your rights have been terminated and not permanently reinstated, you do not qualify to receive new licenses for the same material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or run a copy of the Program. Ancillary propagation of a covered work occurring solely as a consequence of using peer-to-peer transmission to receive a copy likewise does not require acceptance. However, nothing other than this License grants you permission to propagate or modify any covered work. These actions infringe copyright if you do not accept this License. Therefore, by modifying or propagating a covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically receives a license from the original licensors, to run, modify and propagate that work, subject to this License. You are not responsible for enforcing compliance by third parties with this License.
An “entity transaction” is a transaction transferring control of an organization, or substantially all assets of one, or subdividing an organization, or merging organizations. If propagation of a covered work results from an entity transaction, each party to that transaction who receives a copy of the work also receives whatever licenses to the work the party's predecessor in interest had or could give under the previous paragraph, plus a right to possession of the Corresponding Source of the work from the predecessor in interest, if the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the rights granted or affirmed under this License. For example, you may not impose a license fee, royalty, or other charge for exercise of rights granted under this License, and you may not initiate litigation (including a cross-claim or counterclaim in a lawsuit) alleging that any patent claim is infringed by making, using, selling, offering for sale, or importing the Program or any portion of it.
11. Patents.
A “contributor” is a copyright holder who authorizes use under this License of the Program or a work on which the Program is based. The work thus licensed is called the contributor's “contributor version”.
A contributor's “essential patent claims” are all patent claims owned or controlled by the contributor, whether already acquired or hereafter acquired, that would be infringed by some manner, permitted by this License, of making, using, or selling its contributor version, but do not include claims that would be infringed only as a consequence of further modification of the contributor version. For purposes of this definition, “control” includes the right to grant patent sublicenses in a manner consistent with the requirements of this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free patent license under the contributor's essential patent claims, to make, use, sell, offer for sale, import and otherwise run, modify and propagate the contents of its contributor version.
In the following three paragraphs, a “patent license” is any express agreement or commitment, however denominated, not to enforce a patent (such as an express permission to practice a patent or covenant not to sue for patent infringement). To “grant” such a patent license to a party means to make such an agreement or commitment not to enforce a patent against the party.
If you convey a covered work, knowingly relying on a patent license, and the Corresponding Source of the work is not available for anyone to copy, free of charge and under the terms of this License, through a publicly available network server or other readily accessible means, then you must either (1) cause the Corresponding Source to be so available, or (2) arrange to deprive yourself of the benefit of the patent license for this particular work, or (3) arrange, in a manner consistent with the requirements of this License, to extend the patent license to downstream recipients. “Knowingly relying” means you have actual knowledge that, but for the patent license, your conveying the covered work in a country, or your recipient's use of the covered work in a country, would infringe one or more identifiable patents in that country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or arrangement, you convey, or propagate by procuring conveyance of, a covered work, and grant a patent license to some of the parties receiving the covered work authorizing them to use, propagate, modify or convey a specific copy of the covered work, then the patent license you grant is automatically extended to all recipients of the covered work and works based on it.
A patent license is “discriminatory” if it does not include within the scope of its coverage, prohibits the exercise of, or is conditioned on the non-exercise of one or more of the rights that are specifically granted under this License. You may not convey a covered work if you are a party to an arrangement with a third party that is in the business of distributing software, under which you make payment to the third party based on the extent of your activity of conveying the work, and under which the third party grants, to any of the parties who would receive the covered work from you, a discriminatory patent license (a) in connection with copies of the covered work conveyed by you (or copies made from those copies), or (b) primarily for and in connection with specific products or compilations that contain the covered work, unless you entered into that arrangement, or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting any implied license or other defenses to infringement that may otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot convey a covered work so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not convey it at all. For example, if you agree to terms that obligate you to collect a royalty for further conveying from those to whom you convey the Program, the only way you could satisfy both those terms and this License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have permission to link or combine any covered work with a work licensed under version 3 of the GNU Affero General Public License into a single combined work, and to convey the resulting work. The terms of this License will continue to apply to the part which is the covered work, but the special requirements of the GNU Affero General Public License, section 13, concerning interaction through a network will apply to the combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of the GNU General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Program specifies that a certain numbered version of the GNU General Public License “or any later version” applies to it, you have the option of following the terms and conditions either of that numbered version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of the GNU General Public License, you may choose any version ever published by the Free Software Foundation.
If the Program specifies that a proxy can decide which future versions of the GNU General Public License can be used, that proxy's public statement of acceptance of a version permanently authorizes you to choose that version for the Program.
Later license versions may give you additional or different permissions. However, no additional obligations are imposed on any author or copyright holder as a result of your choosing to follow a later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided above cannot be given local legal effect according to their terms, reviewing courts shall apply local law that most closely approximates an absolute waiver of all civil liability in connection with the Program, unless a warranty or assumption of liability accompanies a copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest possible use to the public, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest to attach them to the start of each source file to most effectively state the exclusion of warranty; and each file should have at least the “copyright” line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
You should have received a copy of the GNU General Public License along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short notice like this when it starts in an interactive mode:
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate parts of the General Public License. Of course, your program's commands might be different; for a GUI interface, you would use an “about box”.
You should also get your employer (if you work as a programmer) or school, if any, to sign a “copyright disclaimer” for the program, if necessary. For more information on this, and how to apply and follow the GNU GPL, see <https://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program into proprietary programs. If your program is a subroutine library, you may consider it more useful to permit linking proprietary applications with the library. If this is what you want to do, use the GNU Lesser General Public License instead of this License. But first, please read <https://www.gnu.org/philosophy/why-not-lgpl.html>.
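As a minimal illustration only (not part of this repository or the license text above), the startup-notice advice for programs with terminal interaction could be implemented in Python roughly as follows; the program name, author, and wording are placeholder assumptions, and only the `show w` / `show c` commands suggested above are modeled:

```python
# Hypothetical sketch: print the suggested GPL startup notice and answer
# the `show w' and `show c' commands in an interactive loop.

WARRANTY = (
    "This program is distributed WITHOUT ANY WARRANTY; see sections 15 and 16\n"
    "of the GNU General Public License for details."
)
CONDITIONS = (
    "You may redistribute this program under the terms of the GNU General\n"
    "Public License, version 3 of the License, or (at your option) any later version."
)


def main() -> None:
    # Placeholder program name, year, and author.
    print("exampleprog Copyright (C) 2026 Example Author")
    print("This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.")
    print("This is free software, and you are welcome to redistribute it")
    print("under certain conditions; type `show c' for details.")

    while True:
        try:
            command = input("> ").strip()
        except EOFError:
            break
        if command == "show w":
            print(WARRANTY)
        elif command == "show c":
            print(CONDITIONS)
        elif command in {"quit", "exit"}:
            break


if __name__ == "__main__":
    main()
```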

View File

@@ -2,7 +2,7 @@
name = "nominatim-api"
description = "A tool for building a database of OpenStreetMap for geocoding and for searching the database. Search library."
readme = "README.md"
requires-python = ">=3.7"
requires-python = ">=3.9"
license = 'GPL-3.0-or-later'
maintainers = [
{ name = "Sarah Hoffmann", email = "lonvia@denofr.de" },
@@ -15,6 +15,7 @@ classifiers = [
"Operating System :: OS Independent",
]
dependencies = [
"async-timeout",
"python-dotenv",
"pyYAML>=5.1",
"SQLAlchemy>=1.4.31",

View File

@@ -1 +0,0 @@
../../COPYING

View File

@@ -0,0 +1,232 @@
GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright © 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for software and other kinds of works.
The licenses for most software and other practical works are designed to take away your freedom to share and change the works. By contrast, the GNU General Public License is intended to guarantee your freedom to share and change all versions of a program--to make sure it remains free software for all its users. We, the Free Software Foundation, use the GNU General Public License for most of our software; it applies also to any other work released this way by its authors. You can apply it to your programs, too.
When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for them if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you these rights or asking you to surrender the rights. Therefore, you have certain responsibilities if you distribute copies of the software, or if you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether gratis or for a fee, you must pass on to the recipients the same freedoms that you received. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights.
Developers that use the GNU GPL protect your rights with two steps: (1) assert copyright on the software, and (2) offer you this License giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains that there is no warranty for this free software. For both users' and authors' sake, the GPL requires that modified versions be marked as changed, so that their problems will not be attributed erroneously to authors of previous versions.
Some devices are designed to deny users access to install or run modified versions of the software inside them, although the manufacturer can do so. This is fundamentally incompatible with the aim of protecting users' freedom to change the software. The systematic pattern of such abuse occurs in the area of products for individuals to use, which is precisely where it is most unacceptable. Therefore, we have designed this version of the GPL to prohibit the practice for those products. If such problems arise substantially in other domains, we stand ready to extend this provision to those domains in future versions of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents. States should not allow patents to restrict development and use of software on general-purpose computers, but in those that do, we wish to avoid the special danger that patents applied to a free program could make it effectively proprietary. To prevent this, the GPL assures that patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and modification follow.
TERMS AND CONDITIONS
0. Definitions.
“This License” refers to version 3 of the GNU General Public License.
“Copyright” also means copyright-like laws that apply to other kinds of works, such as semiconductor masks.
“The Program” refers to any copyrightable work licensed under this License. Each licensee is addressed as “you”. “Licensees” and “recipients” may be individuals or organizations.
To “modify” a work means to copy from or adapt all or part of the work in a fashion requiring copyright permission, other than the making of an exact copy. The resulting work is called a “modified version” of the earlier work or a work “based on” the earlier work.
A “covered work” means either the unmodified Program or a work based on the Program.
To “propagate” a work means to do anything with it that, without permission, would make you directly or secondarily liable for infringement under applicable copyright law, except executing it on a computer or modifying a private copy. Propagation includes copying, distribution (with or without modification), making available to the public, and in some countries other activities as well.
To “convey” a work means any kind of propagation that enables other parties to make or receive copies. Mere interaction with a user through a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays “Appropriate Legal Notices” to the extent that it includes a convenient and prominently visible feature that (1) displays an appropriate copyright notice, and (2) tells the user that there is no warranty for the work (except to the extent that warranties are provided), that licensees may convey the work under this License, and how to view a copy of this License. If the interface presents a list of user commands or options, such as a menu, a prominent item in the list meets this criterion.
1. Source Code.
The “source code” for a work means the preferred form of the work for making modifications to it. “Object code” means any non-source form of a work.
A “Standard Interface” means an interface that either is an official standard defined by a recognized standards body, or, in the case of interfaces specified for a particular programming language, one that is widely used among developers working in that language.
The “System Libraries” of an executable work include anything, other than the work as a whole, that (a) is included in the normal form of packaging a Major Component, but which is not part of that Major Component, and (b) serves only to enable use of the work with that Major Component, or to implement a Standard Interface for which an implementation is available to the public in source code form. A “Major Component”, in this context, means a major essential component (kernel, window system, and so on) of the specific operating system (if any) on which the executable work runs, or a compiler used to produce the work, or an object code interpreter used to run it.
The “Corresponding Source” for a work in object code form means all the source code needed to generate, install, and (for an executable work) run the object code and to modify the work, including scripts to control those activities. However, it does not include the work's System Libraries, or general-purpose tools or generally available free programs which are used unmodified in performing those activities but which are not part of the work. For example, Corresponding Source includes interface definition files associated with source files for the work, and the source code for shared libraries and dynamically linked subprograms that the work is specifically designed to require, such as by intimate data communication or control flow between those subprograms and other parts of the work.
The Corresponding Source need not include anything that users can regenerate automatically from other parts of the Corresponding Source.
The Corresponding Source for a work in source code form is that same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of copyright on the Program, and are irrevocable provided the stated conditions are met. This License explicitly affirms your unlimited permission to run the unmodified Program. The output from running a covered work is covered by this License only if the output, given its content, constitutes a covered work. This License acknowledges your rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not convey, without conditions so long as your license otherwise remains in force. You may convey covered works to others for the sole purpose of having them make modifications exclusively for you, or provide you with facilities for running those works, provided that you comply with the terms of this License in conveying all material for which you do not control copyright. Those thus making or running the covered works for you must do so exclusively on your behalf, under your direction and control, on terms that prohibit them from making any copies of your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under the conditions stated below. Sublicensing is not allowed; section 10 makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological measure under any applicable law fulfilling obligations under article 11 of the WIPO copyright treaty adopted on 20 December 1996, or similar laws prohibiting or restricting circumvention of such measures.
When you convey a covered work, you waive any legal power to forbid circumvention of technological measures to the extent such circumvention is effected by exercising rights under this License with respect to the covered work, and you disclaim any intention to limit operation or modification of the work as a means of enforcing, against the work's users, your or third parties' legal rights to forbid circumvention of technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice; keep intact all notices stating that this License and any non-permissive terms added in accord with section 7 apply to the code; keep intact all notices of the absence of any warranty; and give all recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey, and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to produce it from the Program, in the form of source code under the terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified it, and giving a relevant date.
b) The work must carry prominent notices stating that it is released under this License and any conditions added under section 7. This requirement modifies the requirement in section 4 to “keep intact all notices”.
c) You must license the entire work, as a whole, under this License to anyone who comes into possession of a copy. This License will therefore apply, along with any applicable section 7 additional terms, to the whole of the work, and all its parts, regardless of how they are packaged. This License gives no permission to license the work in any other way, but it does not invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display Appropriate Legal Notices; however, if the Program has interactive interfaces that do not display Appropriate Legal Notices, your work need not make them do so.
A compilation of a covered work with other separate and independent works, which are not by their nature extensions of the covered work, and which are not combined with it such as to form a larger program, in or on a volume of a storage or distribution medium, is called an “aggregate” if the compilation and its resulting copyright are not used to limit the access or legal rights of the compilation's users beyond what the individual works permit. Inclusion of a covered work in an aggregate does not cause this License to apply to the other parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms of sections 4 and 5, provided that you also convey the machine-readable Corresponding Source under the terms of this License, in one of these ways:
a) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by the Corresponding Source fixed on a durable physical medium customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by a written offer, valid for at least three years and valid for as long as you offer spare parts or customer support for that product model, to give anyone who possesses the object code either (1) a copy of the Corresponding Source for all the software in the product that is covered by this License, on a durable physical medium customarily used for software interchange, for a price no more than your reasonable cost of physically performing this conveying of source, or (2) access to copy the Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the written offer to provide the Corresponding Source. This alternative is allowed only occasionally and noncommercially, and only if you received the object code with such an offer, in accord with subsection 6b.
d) Convey the object code by offering access from a designated place (gratis or for a charge), and offer equivalent access to the Corresponding Source in the same way through the same place at no further charge. You need not require recipients to copy the Corresponding Source along with the object code. If the place to copy the object code is a network server, the Corresponding Source may be on a different server (operated by you or a third party) that supports equivalent copying facilities, provided you maintain clear directions next to the object code saying where to find the Corresponding Source. Regardless of what server hosts the Corresponding Source, you remain obligated to ensure that it is available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided you inform other peers where the object code and Corresponding Source of the work are being offered to the general public at no charge under subsection 6d.
A separable portion of the object code, whose source code is excluded from the Corresponding Source as a System Library, need not be included in conveying the object code work.
A “User Product” is either (1) a “consumer product”, which means any tangible personal property which is normally used for personal, family, or household purposes, or (2) anything designed or sold for incorporation into a dwelling. In determining whether a product is a consumer product, doubtful cases shall be resolved in favor of coverage. For a particular product received by a particular user, “normally used” refers to a typical or common use of that class of product, regardless of the status of the particular user or of the way in which the particular user actually uses, or expects or is expected to use, the product. A product is a consumer product regardless of whether the product has substantial commercial, industrial or non-consumer uses, unless such uses represent the only significant mode of use of the product.
“Installation Information” for a User Product means any methods, procedures, authorization keys, or other information required to install and execute modified versions of a covered work in that User Product from a modified version of its Corresponding Source. The information must suffice to ensure that the continued functioning of the modified object code is in no case prevented or interfered with solely because modification has been made.
If you convey an object code work under this section in, or with, or specifically for use in, a User Product, and the conveying occurs as part of a transaction in which the right of possession and use of the User Product is transferred to the recipient in perpetuity or for a fixed term (regardless of how the transaction is characterized), the Corresponding Source conveyed under this section must be accompanied by the Installation Information. But this requirement does not apply if neither you nor any third party retains the ability to install modified object code on the User Product (for example, the work has been installed in ROM).
The requirement to provide Installation Information does not include a requirement to continue to provide support service, warranty, or updates for a work that has been modified or installed by the recipient, or for the User Product in which it has been modified or installed. Access to a network may be denied when the modification itself materially and adversely affects the operation of the network or violates the rules and protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided, in accord with this section must be in a format that is publicly documented (and with an implementation available to the public in source code form), and must require no special password or key for unpacking, reading or copying.
7. Additional Terms.
“Additional permissions” are terms that supplement the terms of this License by making exceptions from one or more of its conditions. Additional permissions that are applicable to the entire Program shall be treated as though they were included in this License, to the extent that they are valid under applicable law. If additional permissions apply only to part of the Program, that part may be used separately under those permissions, but the entire Program remains governed by this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option remove any additional permissions from that copy, or from any part of it. (Additional permissions may be written to require their own removal in certain cases when you modify the work.) You may place additional permissions on material, added by you to a covered work, for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you add to a covered work, you may (if authorized by the copyright holders of that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or author attributions in that material or in the Appropriate Legal Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or requiring that modified versions of such material be marked in reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or authors of the material; or
e) Declining to grant rights under trademark law for use of some trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that material by anyone who conveys the material (or modified versions of it) with contractual assumptions of liability to the recipient, for any liability that these contractual assumptions directly impose on those licensors and authors.
All other non-permissive additional terms are considered “further restrictions” within the meaning of section 10. If the Program as you received it, or any part of it, contains a notice stating that it is governed by this License along with a term that is a further restriction, you may remove that term. If a license document contains a further restriction but permits relicensing or conveying under this License, you may add to a covered work material governed by the terms of that license document, provided that the further restriction does not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you must place, in the relevant source files, a statement of the additional terms that apply to those files, or a notice indicating where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the form of a separately written license, or stated as exceptions; the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly provided under this License. Any attempt otherwise to propagate or modify it is void, and will automatically terminate your rights under this License (including any patent licenses granted under the third paragraph of section 11).
However, if you cease all violation of this License, then your license from a particular copyright holder is reinstated (a) provisionally, unless and until the copyright holder explicitly and finally terminates your license, and (b) permanently, if the copyright holder fails to notify you of the violation by some reasonable means prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is reinstated permanently if the copyright holder notifies you of the violation by some reasonable means, this is the first time you have received notice of violation of this License (for any work) from that copyright holder, and you cure the violation prior to 30 days after your receipt of the notice.
Termination of your rights under this section does not terminate the licenses of parties who have received copies or rights from you under this License. If your rights have been terminated and not permanently reinstated, you do not qualify to receive new licenses for the same material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or run a copy of the Program. Ancillary propagation of a covered work occurring solely as a consequence of using peer-to-peer transmission to receive a copy likewise does not require acceptance. However, nothing other than this License grants you permission to propagate or modify any covered work. These actions infringe copyright if you do not accept this License. Therefore, by modifying or propagating a covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically receives a license from the original licensors, to run, modify and propagate that work, subject to this License. You are not responsible for enforcing compliance by third parties with this License.
An “entity transaction” is a transaction transferring control of an organization, or substantially all assets of one, or subdividing an organization, or merging organizations. If propagation of a covered work results from an entity transaction, each party to that transaction who receives a copy of the work also receives whatever licenses to the work the party's predecessor in interest had or could give under the previous paragraph, plus a right to possession of the Corresponding Source of the work from the predecessor in interest, if the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the rights granted or affirmed under this License. For example, you may not impose a license fee, royalty, or other charge for exercise of rights granted under this License, and you may not initiate litigation (including a cross-claim or counterclaim in a lawsuit) alleging that any patent claim is infringed by making, using, selling, offering for sale, or importing the Program or any portion of it.
11. Patents.
A “contributor” is a copyright holder who authorizes use under this License of the Program or a work on which the Program is based. The work thus licensed is called the contributor's “contributor version”.
A contributor's “essential patent claims” are all patent claims owned or controlled by the contributor, whether already acquired or hereafter acquired, that would be infringed by some manner, permitted by this License, of making, using, or selling its contributor version, but do not include claims that would be infringed only as a consequence of further modification of the contributor version. For purposes of this definition, “control” includes the right to grant patent sublicenses in a manner consistent with the requirements of this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free patent license under the contributor's essential patent claims, to make, use, sell, offer for sale, import and otherwise run, modify and propagate the contents of its contributor version.
In the following three paragraphs, a “patent license” is any express agreement or commitment, however denominated, not to enforce a patent (such as an express permission to practice a patent or covenant not to sue for patent infringement). To “grant” such a patent license to a party means to make such an agreement or commitment not to enforce a patent against the party.
If you convey a covered work, knowingly relying on a patent license, and the Corresponding Source of the work is not available for anyone to copy, free of charge and under the terms of this License, through a publicly available network server or other readily accessible means, then you must either (1) cause the Corresponding Source to be so available, or (2) arrange to deprive yourself of the benefit of the patent license for this particular work, or (3) arrange, in a manner consistent with the requirements of this License, to extend the patent license to downstream recipients. “Knowingly relying” means you have actual knowledge that, but for the patent license, your conveying the covered work in a country, or your recipient's use of the covered work in a country, would infringe one or more identifiable patents in that country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or arrangement, you convey, or propagate by procuring conveyance of, a covered work, and grant a patent license to some of the parties receiving the covered work authorizing them to use, propagate, modify or convey a specific copy of the covered work, then the patent license you grant is automatically extended to all recipients of the covered work and works based on it.
A patent license is “discriminatory” if it does not include within the scope of its coverage, prohibits the exercise of, or is conditioned on the non-exercise of one or more of the rights that are specifically granted under this License. You may not convey a covered work if you are a party to an arrangement with a third party that is in the business of distributing software, under which you make payment to the third party based on the extent of your activity of conveying the work, and under which the third party grants, to any of the parties who would receive the covered work from you, a discriminatory patent license (a) in connection with copies of the covered work conveyed by you (or copies made from those copies), or (b) primarily for and in connection with specific products or compilations that contain the covered work, unless you entered into that arrangement, or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting any implied license or other defenses to infringement that may otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot convey a covered work so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not convey it at all. For example, if you agree to terms that obligate you to collect a royalty for further conveying from those to whom you convey the Program, the only way you could satisfy both those terms and this License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have permission to link or combine any covered work with a work licensed under version 3 of the GNU Affero General Public License into a single combined work, and to convey the resulting work. The terms of this License will continue to apply to the part which is the covered work, but the special requirements of the GNU Affero General Public License, section 13, concerning interaction through a network will apply to the combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of the GNU General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Program specifies that a certain numbered version of the GNU General Public License “or any later version” applies to it, you have the option of following the terms and conditions either of that numbered version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of the GNU General Public License, you may choose any version ever published by the Free Software Foundation.
If the Program specifies that a proxy can decide which future versions of the GNU General Public License can be used, that proxy's public statement of acceptance of a version permanently authorizes you to choose that version for the Program.
Later license versions may give you additional or different permissions. However, no additional obligations are imposed on any author or copyright holder as a result of your choosing to follow a later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided above cannot be given local legal effect according to their terms, reviewing courts shall apply local law that most closely approximates an absolute waiver of all civil liability in connection with the Program, unless a warranty or assumption of liability accompanies a copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest possible use to the public, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest to attach them to the start of each source file to most effectively state the exclusion of warranty; and each file should have at least the “copyright” line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
You should have received a copy of the GNU General Public License along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short notice like this when it starts in an interactive mode:
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate parts of the General Public License. Of course, your program's commands might be different; for a GUI interface, you would use an “about box”.
You should also get your employer (if you work as a programmer) or school, if any, to sign a “copyright disclaimer” for the program, if necessary. For more information on this, and how to apply and follow the GNU GPL, see <https://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program into proprietary programs. If your program is a subroutine library, you may consider it more useful to permit linking proprietary applications with the library. If this is what you want to do, use the GNU Lesser General Public License instead of this License. But first, please read <https://www.gnu.org/philosophy/why-not-lgpl.html>.

View File

@@ -2,7 +2,7 @@
name = "nominatim-db"
description = "A tool for building a database of OpenStreetMap for geocoding and for searching the database. Database backend."
readme = "README.md"
requires-python = ">=3.7"
requires-python = ">=3.9"
license = 'GPL-3.0-or-later'
maintainers = [
{ name = "Sarah Hoffmann", email = "lonvia@denofr.de" },
@@ -15,13 +15,13 @@ classifiers = [
"Operating System :: OS Independent",
]
dependencies = [
"psycopg",
"psycopg != 3.3.0",
"python-dotenv",
"jinja2",
"pyYAML>=5.1",
"datrie",
"psutil",
"PyICU"
"PyICU",
"mwparserfromhell"
]
dynamic = ["version"]

View File

@@ -2,4 +2,4 @@
from nominatim_db import cli
exit(cli.nominatim(osm2pgsql_path=None))
exit(cli.nominatim())

View File

@@ -21,10 +21,10 @@
"croft" : 20,
"subdivision" : 22,
"allotments" : 22,
"neighbourhood" : [20, 22],
"neighbourhood" : 24,
"quarter" : [20, 22],
"isolated_dwelling" : [22, 20],
"farm" : [22, 20],
"isolated_dwelling" : [22, 25],
"farm" : [22, 25],
"city_block" : 25,
"mountain_pass" : 25,
"square" : 25,
@@ -48,17 +48,20 @@
"" : [25, 0]
},
"landuse" : {
"residential" : 22,
"farm" : 22,
"farmyard" : 22,
"industrial" : 22,
"commercial" : 22,
"allotments" : 22,
"retail" : 22,
"" : [22, 0]
"residential" : 24,
"farm" : 24,
"farmyard" : 24,
"industrial" : 24,
"commercial" : 24,
"allotments" : 24,
"retail" : 24,
"" : [24, 0]
},
"leisure" : {
"park" : [24, 0]
"park" : [24, 0],
"nature_reserve" : [24, 0],
"garden": [25, 0],
"common": [25, 0]
},
"natural" : {
"peak" : [18, 0],
@@ -216,6 +219,14 @@
}
}
},
{ "countries" : ["sa"],
"tags" : {
"place" : {
"province" : 12,
"municipality" : 18
}
}
},
{ "countries" : ["sk"],
"tags" : {
"boundary" : {
@@ -228,6 +239,17 @@
"administrative11" : 20
}
}
},
{ "countries" : ["jp"],
"tags" : {
"boundary" : {
"administrative7" : 16,
"administrative8" : 18,
"administrative9" : 20,
"administrative10" : 22,
"administrative11" : 24
}
}
}
]

View File

@@ -97,7 +97,6 @@ name:
na: Andorra
ne: एण्डोरा
nl: Andorra
nn: Andorra
"no": Andorra
nv: Andówa
oc: Andòrra

View File

@@ -74,6 +74,7 @@ name:
mt: Emirati Għarab Magħquda
my: အာရပ်စော်ဘွားများပြည်ထောင်စုနိုင်ငံ
na: Emireitit Arabiya
nb: De forente arabiske emirater
ne: संयुक्त अरब इमिरेट्स
nl: Verenigde Arabische Emiraten
nn: Dei sameinte arabiske emirata

View File

@@ -96,7 +96,6 @@ name:
na: Apeganitan
ne: अफगानिस्तान
nl: Afghanistan
nn: Afghanistan
"no": Afghanistan
oc: Afganistan
om: Afgaanistaan

View File

@@ -86,10 +86,8 @@ name:
mt: Antigwa u Barbuda
my: အင်တီဂွါနှင့် ဘာဘူဒါ
na: Antigua me Barbuda
nb: Antigua og Barbuda
ne: एन्टिगुआ र बर्बुडा
nl: Antigua en Barbuda
nn: Antigua og Barbuda
"no": Antigua og Barbuda
nv: Antíígwa dóó Hashkʼaan Bikéyah Yázhí
oc: Antigua e Barbuda

View File

@@ -50,7 +50,6 @@ name:
ms: Anguilla
ne: एन्गुला
nl: Anguilla
nn: Anguilla
"no": Anguilla
oc: Anguilla
pa: ਐਂਗੁਈਲਾ

View File

@@ -101,7 +101,6 @@ name:
na: Arbainiya
ne: अल्बानिया
nl: Albanië
nn: Albania
"no": Albania
nv: Dziłigaii Bikéyah
oc: Albania

View File

@@ -99,7 +99,6 @@ name:
na: Arminiya
ne: आर्मेनिया
nl: Armenië
nn: Armenia
"no": Armenia
nv: Aooméénii Bikéyah
oc: Armenia

View File

@@ -90,7 +90,6 @@ name:
na: Angora
ne: अंगोला
nl: Angola
nn: Angola
"no": Angola
nv: Angóola
ny: Angola

View File

@@ -99,7 +99,6 @@ name:
na: Ardjentina
ne: अर्जेन्टिना
nl: Argentinië
nn: Argentina
"no": Argentina
nv: Béésh Łigaii Bikéyah
oc: Argentina

View File

@@ -93,6 +93,7 @@ name:
mt: Awstrija
my: သြစတြီးယားနိုင်ငံ
na: Oteriya
nb: Østerrike
ne: अष्ट्रीया
nl: Oostenrijk
nn: Austerrike

View File

@@ -94,7 +94,6 @@ name:
na: Otereiriya
ne: अष्ट्रेलिया
nl: Australië
nn: Australia
"no": Australia
nv: Nahatʼeʼiitsoh Bikéyah
oc: Austràlia

View File

@@ -96,10 +96,8 @@ name:
mt: Ażerbajġan
my: အဇာဘိုင်ဂျန်နိုင်ငံ
na: Aderbaidjan
nb: Aserbajdsjan
ne: अजरबैजान
nl: Azerbeidzjan
nn: Aserbajdsjan
"no": Aserbajdsjan
nv: Azééwii Bikéyah
ny: Azerbaijan

View File

@@ -94,7 +94,6 @@ name:
na: Boteniya me Erdegobina
ne: बोस्निया र हर्जगोभिना
nl: Bosnië en Herzegovina
nn: Bosnia-Hercegovina
"no": Bosnia-Hercegovina
nv: Bosna dóó Hetsog Bikéyah
oc: Bòsnia e Ercegovina

View File

@@ -77,7 +77,6 @@ name:
na: Barbadot
ne: बार्बाडोस
nl: Barbados
nn: Barbados
"no": Barbados
oc: Barbados
om: Baarbeedoos

View File

@@ -29,6 +29,7 @@ name:
lt: Bangladešas
lv: Bangladeša
mn: Бангладеш
"no": Bangladesh
pl: Bangladesz
pt: Bangladesh
ru: Бангладеш

View File

@@ -95,10 +95,8 @@ name:
mt: Belġju
my: ဘယ်လ်ဂျီယမ်နိုင်ငံ
na: Berdjiyum
nb: Belgia
ne: बेल्जियम
nl: België
nn: Belgia
"no": Belgia
oc: Belgica
om: Beeljiyeem

View File

@@ -88,7 +88,6 @@ name:
na: Burkinabato
ne: बुर्किना फासो
nl: Burkina Faso
nn: Burkina Faso
"no": Burkina Faso
oc: Burkina Faso
om: Burkinaa Faasoo

View File

@@ -93,7 +93,6 @@ name:
na: Borgeriya
ne: बुल्गेरिया
nl: Bulgarije
nn: Bulgaria
"no": Bulgaria
nv: Bálgaa Bikéyah
oc: Bulgaria

View File

@@ -91,7 +91,6 @@ name:
na: Bahrain
ne: बहराइन
nl: Bahrein
nn: Bahrain
"no": Bahrain
oc: Bahrayn
om: Baahireen

View File

@@ -88,7 +88,6 @@ name:
na: Burundi
ne: बुरूण्डी
nl: Burundi
nn: Burundi
"no": Burundi
oc: Burundi
om: Buruundii

View File

@@ -91,7 +91,6 @@ name:
na: Benin
ne: बेनिन
nl: Benin
nn: Benin
"no": Benin
oc: Benin
om: Beeniin

View File

@@ -25,6 +25,7 @@ name:
lv: Bermudu salas
mk: Бермуда
mn: Бермудын Арал
"no": Bermuda
oc: Bermudas
pl: Bermudy
pt: Bermudas

View File

@@ -91,7 +91,6 @@ name:
na: Brunei
ne: ब्रुनेई
nl: Brunei
nn: Brunei
"no": Brunei
ny: Brunei
oc: Brunei

View File

@@ -94,7 +94,6 @@ name:
na: Boribiya
ne: बोलिभिया
nl: Bolivia
nn: Bolivia
"no": Bolivia
nv: Bolíbiya
oc: Bolívia

View File

@@ -99,7 +99,6 @@ name:
na: Bradir
ne: ब्राजिल
nl: Brazilië
nn: Brasil
"no": Brasil
nv: Bwazííl
oc: Brasil

View File

@@ -85,7 +85,6 @@ name:
na: Bahamat
ne: बहामस
nl: Bahama's
nn: Bahamas
"no": Bahamas
oc: Las Bahamas
om: Bahamaas

View File

@@ -92,7 +92,6 @@ name:
na: Butan
ne: भूटान
nl: Bhutan
nn: Bhutan
"no": Bhutan
nv: Bikéyah
oc: Botan

View File

@@ -90,7 +90,6 @@ name:
na: Botwana
ne: बोत्स्वाना
nl: Botswana
nn: Botswana
"no": Botswana
nv: Tswana Dineʼé Bikéyah
oc: Botswana

View File

@@ -83,8 +83,7 @@ name:
na: Berarut
ne: बेलारुस
nl: Wit-Rusland
nn: Kviterussland
"no": Hviterussland
"no": Belarus
oc: Bielorussia
or: ବେଲାଋଷ
os: Белорусси

View File

@@ -85,7 +85,6 @@ name:
na: Berij
ne: बेलिज
nl: Belize
nn: Belize
"no": Belize
oc: Belize
om: Beliiz

View File

@@ -94,7 +94,6 @@ name:
na: Kanada
ne: क्यानाडा
nl: Canada
nn: Canada
"no": Canada
nv: Deeteel Bikéyah
oc: Canadà

View File

@@ -74,7 +74,6 @@ name:
na: Ripubrikit Engame Kongo
ne: प्रजातान्त्रिक गणतन्त्र कंगो
nl: Democratische Republiek Congo
nn: Den demokratiske republikken Kongo
"no": Den demokratiske republikken Kongo
nv: Kéyah Káango Shádiʼááhjí Siʼánígíí
oc: Republica Democratica de Còngo

View File

@@ -72,6 +72,7 @@ name:
mt: Repubblika Ċentru-Afrikana
my: ဗဟိုအာဖရိကသမ္မတနိုင်ငံ
na: Ripubrikin Aprika Yugaga
nb: Den sentralafrikanske republikk
ne: मध्य अफ्रिकी गणतन्त्र
nl: Centraal-Afrikaanse Republiek
nn: Den sentralafrikanske republikken

View File

@@ -70,7 +70,6 @@ name:
na: Ripubrikin Kongo
ne: कङ्गो
nl: Congo-Brazzaville
nn: Kongo-Brazzaville
"no": Republikken Kongo
nv: Kéyah Káango Náhookǫsjí Siʼánígíí
oc: Republica de Còngo

View File

@@ -84,7 +84,6 @@ name:
na: Switzerland
ne: स्विजरल्याण्ड
nl: Zwitserland
nn: Sveits
"no": Sveits
nv: Swis Bikéyah
oc: Soïssa

View File

@@ -75,6 +75,7 @@ name:
mt: Kosta tal-Avorju
my: အိုင်ဗရီကို့စ်နိုင်ငံ
na: Aibori Kot
nb: Elfenbenskysten
ne: आइभरी कोस्ट
nl: Ivoorkust
nn: Elfenbeinskysten

Some files were not shown because too many files have changed in this diff.