Compare commits

..

22 Commits

Author SHA1 Message Date
Sarah Hoffmann
ec533f6a1a prepare release 4.1.1 2022-11-19 16:15:47 +01:00
Sarah Hoffmann
9f5adabd12 update osm2pgsql to 1.7.1 2022-11-19 15:54:27 +01:00
Sarah Hoffmann
3d9c33192b drop illegal values for addr:interpolation on update 2022-11-19 15:53:29 +01:00
Sarah Hoffmann
05863ae5ca correctly handle special term + name combination
Special terms with operator name usually appear in combination with the
name. The current penalties only took name + special term into account,
not special term + name.

Fixes #2876.
2022-11-19 15:52:19 +01:00
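A minimal sketch of the rule this commit describes, not Nominatim's actual scoring code: the reduced penalty for a special term (e.g. "hotel") must apply whether it precedes or follows the name ("Hotel Bellevue" as well as "Bellevue Hotel"). Token labels and penalty values are illustrative.

```python
def special_term_penalty(tokens: list[str]) -> float:
    """Return a low penalty for any adjacent name/special-term pair.

    Before the fix only the order ("name", "special") received the
    low penalty; comparing as a set covers both orders.
    """
    for first, second in zip(tokens, tokens[1:]):
        if {first, second} == {"name", "special"}:
            return 0.1
    return 1.0
```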
Sarah Hoffmann
a856c56450 fix type issues with calls to pyosmium 2022-11-19 15:51:09 +01:00
Marc Tobias
aa2e4e411b Tiger install doc: add -refresh website- step 2022-11-19 15:51:02 +01:00
Sarah Hoffmann
5cdeaac967 add types-requests dependency 2022-11-19 15:50:45 +01:00
Sarah Hoffmann
6a7b2b823a respect socket timeout also in other replication functions 2022-11-19 15:50:38 +01:00
Sarah Hoffmann
2dd8433ab6 fix timeout use for replication timeout
The timeout parameter is no longer taken into account since
pyosmium switched to the requests library. This adds the parameter
back.
2022-11-19 15:50:30 +01:00
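A hedged sketch of the fix this commit describes: after pyosmium switched to the requests library, the socket timeout has to be passed explicitly on every HTTP call or it is silently ignored. The helper name and keyword handling below are illustrative, not Nominatim's actual code.

```python
from typing import Optional

def with_timeout(request_kwargs: dict, timeout: Optional[float]) -> dict:
    """Re-attach the socket timeout so requests does not wait forever.

    Returns a new kwargs dict suitable for requests.get(**kwargs);
    without the 'timeout' key, requests applies no timeout at all.
    """
    if timeout is not None:
        request_kwargs = dict(request_kwargs, timeout=timeout)
    return request_kwargs
```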
Marc Tobias
951f92f665 update those github action packages still using node12 2022-11-19 15:49:58 +01:00
Sarah Hoffmann
9d009c7967 ignore interpolations without parent on reverse search
If no parent can be found for an interpolation, there is most
likely a data error involved. So don't show these interpolations
in reverse search results.
2022-11-19 15:49:17 +01:00
marc tobias
442e8fb411 Install scripts: remove version from /var/run/php-fpm filenames 2022-11-19 15:48:52 +01:00
Sarah Hoffmann
6a5bbdfae0 actions: pin pyicu to 2.9 2022-11-19 15:48:30 +01:00
marc tobias
6bac238760 Documentation: remove year from TIGER filename 2022-11-19 15:47:05 +01:00
Sarah Hoffmann
185c3cf7a8 mypy: fix new warnings due to external type updates 2022-11-19 15:45:20 +01:00
Mauricio Scheffer
ae5687539a docs: fix links to rank docs 2022-11-19 15:44:58 +01:00
Sarah Hoffmann
d71be2b60a ignore irrelevant extra tags on address interpolations
When deciding if an address interpolation has address information, only
look for addr:street and addr:place. If they are not there, go looking
for the address on the address nodes. This ignores irrelevant tags like
addr:inclusion.

Fixes #2797.
2022-11-19 15:44:20 +01:00
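The rule this commit describes can be sketched as follows (illustrative names, not Nominatim's actual code): only `addr:street` and `addr:place` count as address information on an interpolation; other `addr:*` tags such as `addr:inclusion` are ignored.

```python
# Only these keys mark an interpolation as carrying its own address.
RELEVANT_ADDRESS_KEYS = {"addr:street", "addr:place"}

def has_own_address(tags: dict) -> bool:
    """True if the interpolation's tags contain relevant address info."""
    return any(key in tags for key in RELEVANT_ADDRESS_KEYS)
```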
Sarah Hoffmann
d910f52221 more invalidations when boundary changes rank
When a boundary or place changes its address rank, all places where
it participates as an address potentially need to be reindexed.
Also use the computed rank when testing place nodes against
boundaries. Boundaries are computed earlier.

Fixes #2794.
2022-11-19 15:43:08 +01:00
Sarah Hoffmann
f48a37deea fix base number of returned results
The intent was to always search for at least 10 results.

Improves on #882.
2022-11-19 15:40:43 +01:00
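The stated intent reduces to a simple floor on the internal search target (a sketch; the constant and function names are illustrative, not Nominatim's actual code):

```python
# Always search for at least this many results internally, so that
# later filtering cannot starve the output below the requested limit.
MIN_SEARCH_RESULTS = 10

def search_target(requested_limit: int) -> int:
    """Number of results to search for, given the caller's limit."""
    return max(MIN_SEARCH_RESULTS, requested_limit)
```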
Sarah Hoffmann
c08e3849b8 adapt to new type annotations from typeshed
Some more functions from psycopg are now properly annotated.
No type-ignore comments are necessary anymore.
2022-11-19 15:40:01 +01:00
Sarah Hoffmann
ec92167514 docs: add types-psutil requirement 2022-11-19 15:39:47 +01:00
Sarah Hoffmann
5a05608b34 remove mypy ignore for psutil.virtual_memory()
Now available in typeshed.
2022-11-19 15:39:09 +01:00
125 changed files with 1264 additions and 3349 deletions

.github/FUNDING.yml vendored

@@ -1,2 +0,0 @@
github: lonvia
custom: "https://nominatim.org/funding/"


@@ -9,10 +9,6 @@ inputs:
description: 'Additional options to hand to cmake'
required: false
default: ''
lua:
description: 'Version of Lua to use'
required: false
default: '5.3'
runs:
using: "composite"
@@ -25,7 +21,7 @@ runs:
shell: bash
- name: Install prerequisites
run: |
sudo apt-get install -y -qq libboost-system-dev libboost-filesystem-dev libexpat1-dev zlib1g-dev libbz2-dev libpq-dev libproj-dev libicu-dev liblua${LUA_VERSION}-dev lua${LUA_VERSION}
sudo apt-get install -y -qq libboost-system-dev libboost-filesystem-dev libexpat1-dev zlib1g-dev libbz2-dev libpq-dev libproj-dev libicu-dev
if [ "x$UBUNTUVER" == "x18" ]; then
pip3 install python-dotenv psycopg2==2.7.7 jinja2==2.8 psutil==5.4.2 pyicu==2.9 osmium PyYAML==5.1 datrie
else
@@ -35,7 +31,6 @@ runs:
env:
UBUNTUVER: ${{ inputs.ubuntu }}
CMAKE_ARGS: ${{ inputs.cmake-args }}
LUA_VERSION: ${{ inputs.lua }}
- name: Configure
run: mkdir build && cd build && cmake $CMAKE_ARGS ../Nominatim


@@ -15,9 +15,7 @@ runs:
- name: Remove existing PostgreSQL
run: |
sudo apt-get purge -yq postgresql*
sudo apt install curl ca-certificates gnupg
curl https://www.postgresql.org/media/keys/ACCC4CF8.asc | gpg --dearmor | sudo tee /etc/apt/trusted.gpg.d/apt.postgresql.org.gpg >/dev/null
sudo sh -c 'echo "deb https://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
sudo sh -c 'echo "deb http://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
sudo apt-get update -qq
shell: bash


@@ -37,15 +37,20 @@ jobs:
needs: create-archive
strategy:
matrix:
ubuntu: [20, 22]
ubuntu: [18, 20, 22]
include:
- ubuntu: 18
postgresql: 9.6
postgis: 2.5
pytest: pytest
php: 7.2
- ubuntu: 20
postgresql: 13
postgis: 3
pytest: py.test-3
php: 7.4
- ubuntu: 22
postgresql: 15
postgresql: 14
postgis: 3
pytest: py.test-3
php: 8.1
@@ -64,10 +69,8 @@ jobs:
uses: shivammathur/setup-php@v2
with:
php-version: ${{ matrix.php }}
tools: phpunit:9, phpcs, composer
tools: phpunit, phpcs, composer
ini-values: opcache.jit=disable
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- uses: actions/setup-python@v4
with:
@@ -97,22 +100,18 @@ jobs:
- name: Install latest pylint/mypy
run: pip3 install -U pylint mypy types-PyYAML types-jinja2 types-psycopg2 types-psutil types-requests typing-extensions
if: matrix.ubuntu == 22
- name: PHP linting
run: phpcs --report-width=120 .
working-directory: Nominatim
if: matrix.ubuntu == 22
- name: Python linting
run: pylint nominatim
working-directory: Nominatim
if: matrix.ubuntu == 22
- name: Python static typechecking
run: mypy --strict nominatim
working-directory: Nominatim
if: matrix.ubuntu == 22
- name: PHP unit tests
@@ -266,10 +265,6 @@ jobs:
run: nominatim --version
working-directory: /home/nominatim/nominatim-project
- name: Collect host OS information
run: nominatim admin --collect-os-info
working-directory: /home/nominatim/nominatim-project
- name: Import
run: nominatim import --osm-file ../test.pbf
working-directory: /home/nominatim/nominatim-project


@@ -13,6 +13,6 @@ ignored-classes=NominatimArgs,closing
# 'too-many-ancestors' is triggered already by deriving from UserDict
# 'not-context-manager' disabled because it causes false positives once
# typed Python is enabled. See also https://github.com/PyCQA/pylint/issues/5273
disable=too-few-public-methods,duplicate-code,too-many-ancestors,bad-option-value,no-self-use,not-context-manager,use-dict-literal
disable=too-few-public-methods,duplicate-code,too-many-ancestors,bad-option-value,no-self-use,not-context-manager
good-names=i,x,y,m,fd,db,cc
good-names=i,x,y,fd,db,cc


@@ -19,8 +19,8 @@ list(APPEND CMAKE_MODULE_PATH "${CMAKE_SOURCE_DIR}/cmake")
project(nominatim)
set(NOMINATIM_VERSION_MAJOR 4)
set(NOMINATIM_VERSION_MINOR 2)
set(NOMINATIM_VERSION_PATCH 4)
set(NOMINATIM_VERSION_MINOR 1)
set(NOMINATIM_VERSION_PATCH 1)
set(NOMINATIM_VERSION "${NOMINATIM_VERSION_MAJOR}.${NOMINATIM_VERSION_MINOR}.${NOMINATIM_VERSION_PATCH}")
@@ -63,6 +63,7 @@ if (BUILD_IMPORTER AND BUILD_OSM2PGSQL)
endif()
set(BUILD_TESTS_SAVED "${BUILD_TESTS}")
set(BUILD_TESTS off)
set(WITH_LUA off CACHE BOOL "")
add_subdirectory(osm2pgsql)
set(BUILD_TESTS ${BUILD_TESTS_SAVED})
endif()
@@ -269,12 +270,6 @@ install(FILES settings/env.defaults
settings/import-address.style
settings/import-full.style
settings/import-extratags.style
settings/import-admin.lua
settings/import-street.lua
settings/import-address.lua
settings/import-full.lua
settings/import-extratags.lua
settings/flex-base.lua
settings/icu_tokenizer.yaml
settings/country_settings.yaml
DESTINATION ${NOMINATIM_CONFIGDIR})


@@ -64,39 +64,3 @@ Before submitting a pull request make sure that the tests pass:
cd build
make test
```
## Releases
Nominatim follows semantic versioning. Major releases are done for large changes
that require (or at least strongly recommend) a reimport of the databases.
Minor releases can usually be applied to existing databases. Patch releases
contain bug fixes only and are released from a separate branch where the
relevant changes are cherry-picked from the master branch.
Checklist for releases:
* [ ] increase version in `nominatim/version.py` and CMakeLists.txt
* [ ] update `ChangeLog` (copy information from patch releases from release branch)
* [ ] complete `docs/admin/Migration.md`
* [ ] update EOL dates in `SECURITY.md`
* [ ] commit and make sure CI tests pass
* [ ] test migration
* download, build and import previous version
* migrate using master version
* run updates using master version
* [ ] prepare tarball:
* `git clone --recursive https://github.com/osm-search/Nominatim` (switch to right branch!)
* `rm -r .git* osm2pgsql/.git*`
* copy country data into `data/`
* add version to base directory and package
* [ ] upload tarball to https://nominatim.org
* [ ] prepare documentation
* check out new docs branch
* change git checkout instructions to tarball download instructions or adapt version on existing ones
* build documentation and copy to https://github.com/osm-search/nominatim-org-site
* add new version to history
* [ ] check release tarball
* download tarball as per new documentation instructions
* compile and import Nominatim
* run `nominatim --version` to confirm correct version
* [ ] tag new release and add a release on github.com


@@ -1,45 +1,7 @@
4.2.4
* fix a potential SQL injection in 'nominatim admin --collect-os-info'
* fix compatibility issue with PostGIS 3.4
4.1.1
4.2.3
* fix deletion handling for 'nominatim add-data'
* adapt place_force_delete() to new deletion handling
* flex style: avoid dropping of postcode areas
* fix update errors on address interpolation handling
4.2.2
* extend flex-style library to fully support all default styles
* fix handling of Hebrew aleph
* do not assign postcodes to rivers
* fix string matching in PHP code
* update osm2pgsql (various updates to flex)
* fix slow query when deleting places on update
* fix CLI details query
* fix recalculation of importance values
* fix polygon simplification in reverse results
* add class/type information to reverse geocodejson result
* minor improvements to default tokenizer configuration
* various smaller fixes to documentation
4.2.1
* fix XSS vulnerability in debug view
4.2.0
* add experimental support for osm2pgsql flex style
* introduce secondary importance value to be retrieved from a raster data file
(currently still unused, to replace address importance, thanks to @tareqpi)
* add new report tool `nominatim admin --collect-os-info`
(thanks @micahcochran, @tareqpi)
* reorganise index to improve lookup performance and size
* run index creation after import in parallel
* run ANALYZE more selectively to speed up continuation of indexing
* fix crash on update when addr:interpolation receives an illegal value
* fix minimum number of retrieved results to be at least 10
* fix minimum number of retrived results to be at least 10
* fix search for combinations of special term + name (e.g Hotel Bellevue)
* do not return interpolations without a parent street on reverse search
* improve invalidation of linked places on updates
@@ -47,13 +9,8 @@
* make sure socket timeouts are respected during replication
(working around a bug in some versions of pyosmium)
* update bundled osm2pgsql to 1.7.1
* add support for PostgreSQL 15
* typing fixes to work with latest type annotations from typeshed
* smaller improvements to documentation (thanks to @mausch)
4.1.1
* fix XSS vulnerability in debug view
* smaller improvements to documention (thanks to @mausch)
4.1.0
@@ -91,10 +48,6 @@
* add setup instructions for updates and systemd
* drop support for PostgreSQL 9.5
4.0.2
* fix XSS vulnerability in debug view
4.0.1
* fix initialisation error in replication script
@@ -133,10 +86,6 @@
* add testing of installation scripts via CI
* drop support for Python < 3.6 and Postgresql < 9.5
3.7.3
* fix XSS vulnerability in debug view
3.7.2
* fix database check for reverse-only imports


@@ -9,11 +9,10 @@ versions.
| Version | End of support for security updates |
| ------- | ----------------------------------- |
| 4.2.x | 2024-11-24 |
| 4.1.x | 2024-08-05 |
| 4.0.x | 2023-11-02 |
| 3.7.x | 2023-04-05 |
| 3.6.x | 2022-12-12 |
| 3.5.x | 2022-06-05 |
## Reporting a Vulnerability


@@ -1,6 +1,6 @@
# Install Nominatim in a virtual machine for development and testing
This document describes how you can install Nominatim inside a Ubuntu 22
This document describes how you can install Nominatim inside a Ubuntu 16
virtual machine on your desktop/laptop (host machine). The goal is to give
you a development environment to easily edit code and run the test suite
without affecting the rest of your system.
@@ -69,7 +69,8 @@ installation.
PHP errors are written to `/var/log/apache2/error.log`.
With `echo` and `var_dump()` you write into the output (HTML/XML/JSON) when
you either add `&debug=1` to the URL.
you either add `&debug=1` to the URL (preferred) or set
`@define('CONST_Debug', true);` in `settings/local.php`.
In the Python BDD test you can use `logger.info()` for temporary debug
statements.
@@ -129,10 +130,6 @@ and then
Yes, Vagrant and Virtualbox can be installed on MS Windows just fine. You need a 64bit
version of Windows.
##### Will it run on Apple Silicon?
You might need to replace Virtualbox with [Parallels](https://www.parallels.com/products/desktop/).
There is no free/open source version of Parallels.
##### Why Monaco, can I use another country?
@@ -144,12 +141,11 @@ No. Long running Nominatim installations will differ once new import features (o
bug fixes) get added since those usually only get applied to new/changed data.
Also this document skips the optional Wikipedia data import which affects ranking
of search results. See [Nominatim installation](https://nominatim.org/release-docs/latest/admin/Installation)
for details.
of search results. See [Nominatim installation](https://nominatim.org/release-docs/latest/admin/Installation) for details.
##### Why Ubuntu? Can I test CentOS/Fedora/CoreOS/FreeBSD?
There used to be a Vagrant script for CentOS available, but the Nominatim directory
There is a Vagrant script for CentOS available, but the Nominatim directory
isn't symlinked/mounted to the host which makes development trickier. We used
it mainly for debugging installation with SELinux.
@@ -158,17 +154,14 @@ are slightly different, e.g. the name of the package manager, Apache2 package
name, location of files. We chose Ubuntu because that is closest to the
nominatim.openstreetmap.org production environment.
You can configure/download other Vagrant boxes from
[https://app.vagrantup.com/boxes/search](https://app.vagrantup.com/boxes/search).
You can configure/download other Vagrant boxes from [https://app.vagrantup.com/boxes/search](https://app.vagrantup.com/boxes/search).
##### How can I connect to an existing database?
Let's say you have a Postgres database named `nominatim_it` on server `your-server.com`
and port `5432`. The Postgres username is `postgres`. You can edit the `.env` in your
project directory and point Nominatim to it.
NOMINATIM_DATABASE_DSN="pgsql:host=your-server.com;port=5432;user=postgres;dbname=nominatim_it
Let's say you have a Postgres database named `nominatim_it` on server `your-server.com` and port `5432`. The Postgres username is `postgres`. You can edit `settings/local.php` and point Nominatim to it.
pgsql:host=your-server.com;port=5432;user=postgres;dbname=nominatim_it
No data import or restarting necessary.
If the Postgres installation is behind a firewall, you can try
@@ -176,12 +169,11 @@ If the Postgres installation is behind a firewall, you can try
ssh -L 9999:localhost:5432 your-username@your-server.com
inside the virtual machine. It will map the port to `localhost:9999` and then
you edit `.env` file with
you edit `settings/local.php` with
NOMINATIM_DATABASE_DSN="pgsql:host=localhost;port=9999;user=postgres;dbname=nominatim_it"
@define('CONST_Database_DSN', 'pgsql:host=localhost;port=9999;user=postgres;dbname=nominatim_it');
To access postgres directly remember to specify the hostname,
e.g. `psql --host localhost --port 9999 nominatim_it`
To access postgres directly remember to specify the hostname, e.g. `psql --host localhost --port 9999 nominatim_it`
##### My computer is slow and the import takes too long. Can I start the virtual machine "in the cloud"?


@@ -74,15 +74,15 @@ but it will improve the quality of the results if this is installed.
This data is available as a binary download. Put it into your project directory:
cd $PROJECT_DIR
wget https://nominatim.org/data/wikimedia-importance.sql.gz
wget https://www.nominatim.org/data/wikimedia-importance.sql.gz
The file is about 400MB and adds around 4GB to the Nominatim database.
!!! tip
If you forgot to download the wikipedia rankings, then you can
also add importances after the import. Download the SQL files, then
run `nominatim refresh --wiki-data --importance`. Updating
importances for a planet will take a couple of hours.
If you forgot to download the wikipedia rankings, you can also add
importances after the import. Download the files, then run
`nominatim refresh --wiki-data --importance`. Updating importances for
a planet can take a couple of hours.
### External postcodes
@@ -92,8 +92,8 @@ and the UK (using the [CodePoint OpenData set](https://osdatahub.os.uk/downloads
This data can be optionally downloaded into the project directory:
cd $PROJECT_DIR
wget https://nominatim.org/data/gb_postcodes.csv.gz
wget https://nominatim.org/data/us_postcodes.csv.gz
wget https://www.nominatim.org/data/gb_postcodes.csv.gz
wget https://www.nominatim.org/data/us_postcodes.csv.gz
You can also add your own custom postcode sources, see
[Customization of postcodes](../customize/Postcodes.md).
@@ -139,7 +139,7 @@ import. So this option is particularly interesting if you plan to transfer the
database or reuse the space later.
!!! warning
The data structure for updates are also required when adding additional data
The datastructure for updates are also required when adding additional data
after the import, for example [TIGER housenumber data](../customize/Tiger.md).
If you plan to use those, you must not use the `--no-updates` parameter.
Do a normal import, add the external data and once you are done with


@@ -135,7 +135,7 @@ git clone --recursive https://github.com/openstreetmap/Nominatim.git
The development version does not include the country grid. Download it separately:
```
wget -O Nominatim/data/country_osm_grid.sql.gz https://nominatim.org/data/country_grid.sql.gz
wget -O Nominatim/data/country_osm_grid.sql.gz https://www.nominatim.org/data/country_grid.sql.gz
```
### Building Nominatim


@@ -59,27 +59,3 @@ suited for these kinds of queries.
That said, if you installed your own Nominatim instance, you can use the
`nominatim export` PHP script as a basis to return such lists.
#### 7. My result has a wrong postcode. Where does it come from?
Most places in OSM don't have a postcode, so Nominatim tries to interpolate
one. It first looks at all the places that make up the address of the place.
If one of them has a postcode defined, that is the one to be used. When
none of the address parts has a postcode, Nominatim interpolates one
from the surrounding objects. If the postcode for your result is wrong, then
most of the time there is an OSM object with the wrong postcode nearby.
To find the bad postcode, go to
[https://nominatim.openstreetmap.org](https://nominatim.openstreetmap.org)
and search for your place. When you have found it, click on the 'details' link
under the result to go to the details page. There is a field 'Computed Postcode'
which should display the bad postcode. Click on the 'how?' link. A small
explanation text appears. It contains a link to a query for Overpass Turbo.
Click on that and you get a map with all places in the area that have the bad
postcode. If none is displayed, zoom the map out a bit and then click on 'Run'.
Now go to [OpenStreetMap](https://openstreetmap.org) and fix the error you
have just found. It will take at least a day for Nominatim to catch up with
your data fix. Sometimes longer, depending on how much editing activity is in
the area.
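The lookup order described in this FAQ entry can be sketched as follows (illustrative data shapes, not Nominatim's internal representation): a postcode on one of the address parts wins; otherwise a postcode is interpolated from surrounding objects, here approximated as the most common nearby value.

```python
from collections import Counter
from typing import Optional

def computed_postcode(address_parts: list, nearby: list) -> Optional[str]:
    """Sketch of postcode interpolation for a place without its own postcode."""
    # First place in the address chain with a postcode wins.
    for part in address_parts:
        if part.get("postcode"):
            return part["postcode"]
    # Otherwise fall back to the surrounding objects. A single nearby
    # object with a wrong postcode can dominate this step, which is
    # exactly the failure mode the FAQ entry describes.
    if nearby:
        return Counter(nearby).most_common(1)[0][0]
    return None
```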


@@ -211,8 +211,8 @@ be more than one. The attributes of that element contain:
* `ref` - content of `ref` tag if it exists
* `lat`, `lon` - latitude and longitude of the centroid of the object
* `boundingbox` - comma-separated list of corner coordinates ([see notes](#boundingbox))
* `place_rank` - class [search rank](../customize/Ranking.md#search-rank)
* `address_rank` - place [address rank](../customize/Ranking.md#address-rank)
* `place_rank` - class [search rank](../customize/Ranking#search-rank)
* `address_rank` - place [address rank](../customize/Ranking#address-rank)
* `display_name` - full comma-separated address
* `class`, `type` - key and value of the main OSM tag
* `importance` - computed importance rank
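The attributes listed above could be collected into a container like this (an illustrative sketch only; the real API emits them as XML attributes, and `class`/`type` are renamed here because `class` is a Python keyword):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PlaceResult:
    lat: float              # latitude of the centroid
    lon: float              # longitude of the centroid
    boundingbox: str        # comma-separated corner coordinates
    place_rank: int         # search rank
    address_rank: int       # address rank
    display_name: str       # full comma-separated address
    osm_class: str          # key of the main OSM tag ('class' in the API)
    osm_type: str           # value of the main OSM tag ('type' in the API)
    importance: float       # computed importance rank
    ref: Optional[str] = None  # content of the ref tag, if it exists
```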


@@ -35,7 +35,7 @@ Additional parameters are accepted as listed below.
!!! warning "Deprecation warning"
The reverse API used to allow address lookup for a single OSM object by
its OSM id. This use is now deprecated. Use the [Address Lookup API](Lookup.md)
its OSM id. This use is now deprecated. Use the [Address Lookup API](../Lookup)
instead.
### Output format


@@ -57,11 +57,10 @@ code and message, e.g.
Possible status codes are
| | message | notes |
| --- | ------------------------------ | ----------------------------------------------------------------- |
| 700 | "No database" | connection failed |
| 701 | "Module failed" | database could not load nominatim.so |
| 702 | "Module call failed" | nominatim.so loaded but calling a function failed |
| 703 | "Query failed" | test query against a database table failed |
| 704 | "No value" | test query worked but returned no results |
| 705 | "Import date is not available" | No import dates were returned (enabling replication can fix this) |
| | message | notes |
|-----|----------------------|---------------------------------------------------|
| 700 | "No database" | connection failed |
| 701 | "Module failed" | database could not load nominatim.so |
| 702 | "Module call failed" | nominatim.so loaded but calling a function failed |
| 703 | "Query failed" | test query against a database table failed |
| 704 | "No value" | test query worked but returned no results |
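For reference, the status codes from the tables above collected into a lookup dict (a sketch; code 705 appears in only one of the two table versions shown in this diff):

```python
# Status codes and messages as listed in the tables above.
HEALTHCHECK_STATUS = {
    700: "No database",
    701: "Module failed",
    702: "Module call failed",
    703: "Query failed",
    704: "No value",
    705: "Import date is not available",
}
```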


@@ -1,49 +0,0 @@
## Importance
Search requests can yield multiple results which match equally well with
the original query. In such a case, Nominatim needs to order the results
according to a different criterion: importance. This is a measure of how
likely it is that a user will search for a given place. This section explains
the sources Nominatim uses for computing importance of a place and how to
customize them.
### How importance is computed
The main value for importance is derived from page ranking values for Wikipedia
pages for a place. For places that do not have their own
Wikipedia page, a formula is used that derives a static importance from the
place's [search rank](../customize/Ranking.md#search-rank).
In a second step, a secondary importance value is added which is meant to
represent how well-known the general area is where the place is located. It
functions as a tie-breaker between places with very similar primary
importance values.
nominatim.org has preprocessed importance tables for the
[primary Wikipedia rankings](https://nominatim.org/data/wikimedia-importance.sql.gz)
and for a secondary importance based on the number of tile views on openstreetmap.org.
### Customizing secondary importance
The secondary importance is implemented as a simple
[Postgis raster](https://postgis.net/docs/raster.html) table, where Nominatim
looks up the value for the coordinates of the centroid of a place. You can
provide your own secondary importance raster in the form of an SQL file named
`secondary_importance.sql.gz` in your project directory.
The SQL file needs to drop and (re)create a table `secondary_importance` which
must at a minimum contain a column `rast` of type `raster`. The raster must
be in EPSG:4326 and contain 16bit unsigned ints
(`raster_constraint_pixel_types(rast) = '{16BUI}'`). Any other columns in the
table will be ignored. You must furthermore create an index as follows:
```
CREATE INDEX ON secondary_importance USING gist(ST_ConvexHull(rast))
```
The following raster2pgsql command will create a table that conforms to
the requirements:
```
raster2pgsql -I -C -Y -d -t 128x128 input.tiff public.secondary_importance
```


@@ -148,29 +148,6 @@ Setting this option to 'yes' means that Nominatim skips reindexing of contained
objects when the area becomes too large.
#### NOMINATIM_UPDATE_FORWARD_DEPENDENCIES
| Summary | |
| -------------- | --------------------------------------------------- |
| **Description:** | Forward geometry changes to dependent objects |
| **Format:** | bool |
| **Default:** | no |
| **Comment:** | EXPERT ONLY. Must not be enabled after import. |
The geometry of OSM ways and relations may change when a node that is part
of the object is moved around. These changes are not propagated by default.
The geometry of ways/relations is only updated the next time that the object
itself is touched. When this option is enabled, dependent objects will
be marked for update when one of their member objects changes.
Enabling this option may slow down updates significantly.
!!! warning
If you want to enable this option, it must already be set during the import.
Do not enable this option on an existing database that was imported with
NOMINATIM_UPDATE_FORWARD_DEPENDENCIES=no.
Updates will become unusably slow.
#### NOMINATIM_LANGUAGES
| Summary | |
@@ -666,7 +643,7 @@ The entries in the log file have the following format:
<request time> <execution time in s> <number of results> <type> "<query string>"
Request time is the time when the request was started. The execution time is
given in seconds and corresponds to the time the query took executing in PHP.
given in ms and corresponds to the time the query took executing in PHP.
type contains the name of the endpoint used.
Can be used as the same time as NOMINATIM_LOG_DB.


@@ -213,15 +213,6 @@ The following is a list of sanitizers that are shipped with Nominatim.
rendering:
heading_level: 6
##### clean-tiger-tags
::: nominatim.tokenizer.sanitizers.clean_tiger_tags
selection:
members: False
rendering:
heading_level: 6
#### Token Analysis


@@ -30,7 +30,6 @@ nav:
- 'Configuration Settings': 'customize/Settings.md'
- 'Per-Country Data': 'customize/Country-Settings.md'
- 'Place Ranking' : 'customize/Ranking.md'
- 'Importance' : 'customize/Importance.md'
- 'Tokenizers' : 'customize/Tokenizers.md'
- 'Special Phrases': 'customize/Special-Phrases.md'
- 'External data: US housenumbers from TIGER': 'customize/Tiger.md'


@@ -135,7 +135,7 @@ class Debug
public static function printSQL($sSQL)
{
echo '<p><tt><font color="#aaa">'.htmlspecialchars($sSQL, ENT_QUOTES | ENT_SUBSTITUTE | ENT_HTML401).'</font></tt></p>'."\n";
echo '<p><tt><font color="#aaa">'.$sSQL.'</font></tt></p>'."\n";
}
private static function outputVar($mVar, $sPreNL)
@@ -178,12 +178,11 @@ class Debug
}
if (is_string($mVar)) {
$sOut = "'$mVar'";
} else {
$sOut = (string)$mVar;
echo "'$mVar'";
return strlen($mVar) + 2;
}
echo htmlspecialchars($sOut, ENT_QUOTES | ENT_SUBSTITUTE | ENT_HTML401);
return strlen($sOut);
echo (string)$mVar;
return strlen((string)$mVar);
}
}


@@ -874,7 +874,7 @@ class Geocode
$iCountWords = 0;
$sAddress = $aResult['langaddress'];
foreach ($aRecheckWords as $i => $sWord) {
if (grapheme_stripos($sAddress, $sWord)!==false) {
if (stripos($sAddress, $sWord)!==false) {
$iCountWords++;
if (preg_match('/(^|,)\s*'.preg_quote($sWord, '/').'\s*(,|$)/', $sAddress)) {
$iCountWords += 0.1;


@@ -187,12 +187,12 @@ class PlaceLookup
return null;
}
$aResults = $this->lookup(array($iPlaceID => new Result($iPlaceID)), 0, 30, true);
$aResults = $this->lookup(array($iPlaceID => new Result($iPlaceID)));
return empty($aResults) ? null : reset($aResults);
}
public function lookup($aResults, $iMinRank = 0, $iMaxRank = 30, $bAllowLinked = false)
public function lookup($aResults, $iMinRank = 0, $iMaxRank = 30)
{
Debug::newFunction('Place lookup');
@@ -247,9 +247,7 @@ class PlaceLookup
if ($this->sAllowedTypesSQLList) {
$sSQL .= 'AND placex.class in '.$this->sAllowedTypesSQLList;
}
if (!$bAllowLinked) {
$sSQL .= ' AND linked_place_id is null ';
}
$sSQL .= ' AND linked_place_id is null ';
$sSQL .= ' GROUP BY ';
$sSQL .= ' osm_type, ';
$sSQL .= ' osm_id, ';
@@ -524,7 +522,12 @@ class PlaceLookup
// Get the bounding box and outline polygon
$sSQL = 'select place_id,0 as numfeatures,st_area(geometry) as area,';
$sSQL .= ' ST_Y(centroid) as centrelat, ST_X(centroid) as centrelon,';
if ($fLonReverse != null && $fLatReverse != null) {
$sSQL .= ' ST_Y(closest_point) as centrelat,';
$sSQL .= ' ST_X(closest_point) as centrelon,';
} else {
$sSQL .= ' ST_Y(centroid) as centrelat, ST_X(centroid) as centrelon,';
}
$sSQL .= ' ST_YMin(geometry) as minlat,ST_YMax(geometry) as maxlat,';
$sSQL .= ' ST_XMin(geometry) as minlon,ST_XMax(geometry) as maxlon';
if ($this->bIncludePolygonAsGeoJSON) {
@@ -539,21 +542,19 @@ class PlaceLookup
if ($this->bIncludePolygonAsText) {
$sSQL .= ',ST_AsText(geometry) as astext';
}
$sSQL .= ' FROM (SELECT place_id';
if ($fLonReverse != null && $fLatReverse != null) {
$sSQL .= ',CASE WHEN (class = \'highway\') AND (ST_GeometryType(geometry) = \'ST_LineString\') THEN ';
$sSQL .=' ST_ClosestPoint(geometry, ST_SetSRID(ST_Point('.$fLatReverse.','.$fLonReverse.'),4326))';
$sSQL .=' ELSE centroid END AS centroid';
$sFrom = ' from (SELECT * , CASE WHEN (class = \'highway\') AND (ST_GeometryType(geometry) = \'ST_LineString\') THEN ';
$sFrom .=' ST_ClosestPoint(geometry, ST_SetSRID(ST_Point('.$fLatReverse.','.$fLonReverse.'),4326))';
$sFrom .=' ELSE centroid END AS closest_point';
$sFrom .= ' from placex where place_id = '.$iPlaceID.') as plx';
} else {
$sSQL .= ',centroid';
$sFrom = ' from placex where place_id = '.$iPlaceID;
}
if ($this->fPolygonSimplificationThreshold > 0) {
$sSQL .= ',ST_SimplifyPreserveTopology(geometry,'.$this->fPolygonSimplificationThreshold.') as geometry';
$sSQL .= ' from (select place_id,centroid,ST_SimplifyPreserveTopology(geometry,'.$this->fPolygonSimplificationThreshold.') as geometry'.$sFrom.') as plx';
} else {
$sSQL .= ',geometry';
$sSQL .= $sFrom;
}
$sSQL .= ' FROM placex where place_id = '.$iPlaceID.') as plx';
$aPointPolygon = $this->oDB->getRow($sSQL, null, 'Could not get outline');


@@ -189,16 +189,14 @@ class ReverseGeocode
$sSQL .= '(select place_id, parent_place_id, rank_address, rank_search, country_code, geometry';
$sSQL .= ' FROM placex';
$sSQL .= ' WHERE ST_GeometryType(geometry) in (\'ST_Polygon\', \'ST_MultiPolygon\')';
// Ensure that query planner doesn't use the index on rank_search.
$sSQL .= ' AND coalesce(rank_search, 0) between 5 and ' .$iMaxRank;
$sSQL .= ' AND rank_address between 4 and 25'; // needed for index selection
$sSQL .= ' AND rank_address Between 5 AND ' .$iMaxRank;
$sSQL .= ' AND geometry && '.$sPointSQL;
$sSQL .= ' AND type != \'postcode\' ';
$sSQL .= ' AND name is not null';
$sSQL .= ' AND indexed_status = 0 and linked_place_id is null';
$sSQL .= ' ORDER BY rank_search DESC LIMIT 50 ) as a';
$sSQL .= ' WHERE ST_Contains(geometry, '.$sPointSQL.' )';
$sSQL .= ' ORDER BY rank_search DESC LIMIT 1';
$sSQL .= ' ORDER BY rank_address DESC LIMIT 50 ) as a';
$sSQL .= ' WHERE ST_CONTAINS(geometry, '.$sPointSQL.' )';
$sSQL .= ' ORDER BY rank_address DESC LIMIT 1';
Debug::printSQL($sSQL);
$aPoly = $this->oDB->getRow($sSQL, null, 'Could not determine polygon containing the point.');
@@ -210,7 +208,7 @@ class ReverseGeocode
$iRankSearch = $aPoly['rank_search'];
$iPlaceID = $aPoly['place_id'];
if ($iRankSearch != $iMaxRank) {
if ($iRankAddress != $iMaxRank) {
$sSQL = 'SELECT place_id FROM ';
$sSQL .= '(SELECT place_id, rank_search, country_code, geometry,';
$sSQL .= ' ST_distance('.$sPointSQL.', geometry) as distance';

@@ -36,9 +36,6 @@ if (empty($aPlace)) {
$aFilteredPlaces['properties']['geocoding']['osm_id'] = $aPlace['osm_id'];
}
$aFilteredPlaces['properties']['geocoding']['osm_key'] = $aPlace['class'];
$aFilteredPlaces['properties']['geocoding']['osm_value'] = $aPlace['type'];
$aFilteredPlaces['properties']['geocoding']['type'] = addressRankToGeocodeJsonType($aPlace['rank_address']);
$aFilteredPlaces['properties']['geocoding']['accuracy'] = (int) $fDistance;

@@ -100,55 +100,32 @@ LANGUAGE plpgsql STABLE;
CREATE OR REPLACE FUNCTION compute_importance(extratags HSTORE,
country_code varchar(2),
rank_search SMALLINT,
centroid GEOMETRY)
osm_type varchar(1), osm_id BIGINT)
RETURNS place_importance
AS $$
DECLARE
match RECORD;
result place_importance;
osm_views_exists BIGINT;
views BIGINT;
BEGIN
-- add importance by wikipedia article if the place has one
FOR match IN
SELECT * FROM get_wikipedia_match(extratags, country_code)
WHERE language is not NULL
FOR match IN SELECT * FROM get_wikipedia_match(extratags, country_code)
WHERE language is not NULL
LOOP
result.importance := match.importance;
result.wikipedia := match.language || ':' || match.title;
RETURN result;
END LOOP;
-- Nothing? Then try with the wikidata tag.
IF result.importance is null AND extratags ? 'wikidata' THEN
IF extratags ? 'wikidata' THEN
FOR match IN SELECT * FROM wikipedia_article
WHERE wd_page_title = extratags->'wikidata'
ORDER BY language = 'en' DESC, langcount DESC LIMIT 1
LOOP
ORDER BY language = 'en' DESC, langcount DESC LIMIT 1 LOOP
result.importance := match.importance;
result.wikipedia := match.language || ':' || match.title;
RETURN result;
END LOOP;
END IF;
-- Still nothing? Fall back to a default.
IF result.importance is null THEN
result.importance := 0.75001 - (rank_search::float / 40);
END IF;
{% if 'secondary_importance' in db.tables %}
FOR match IN
SELECT ST_Value(rast, centroid) as importance
FROM secondary_importance
WHERE ST_Intersects(ST_ConvexHull(rast), centroid) LIMIT 1
LOOP
-- Secondary importance as tie breaker with 0.0001 weight.
result.importance := result.importance + match.importance::float / 655350000;
END LOOP;
{% endif %}
RETURN result;
RETURN null;
END;
$$
LANGUAGE plpgsql;
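The `compute_importance` function above resolves importance in layers: a Wikipedia/Wikidata match wins outright, otherwise the rank-derived default `0.75001 - rank_search/40` applies, optionally nudged by the secondary-importance raster with a 1/655350000 tie-breaker weight. A hedged Python sketch of that arithmetic (function name and inputs are illustrative stand-ins, not Nominatim API):

```python
def compute_importance(rank_search, wiki_importance=None, secondary_raster=None):
    # A Wikipedia/Wikidata match takes precedence over everything else.
    if wiki_importance is not None:
        return wiki_importance
    # Fallback: derive importance from the search rank (ranks run 0..30,
    # so lower-ranked, i.e. more prominent, places score higher).
    importance = 0.75001 - rank_search / 40
    # Secondary importance acts only as a tie breaker (~0.0001 weight).
    if secondary_raster is not None:
        importance += secondary_raster / 655350000
    return importance
```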

@@ -52,9 +52,7 @@ BEGIN
IF parent_place_id is null THEN
FOR location IN SELECT place_id FROM placex
WHERE ST_DWithin(geom, placex.geometry, 0.001)
and placex.rank_search = 26
and placex.osm_type = 'W' -- needed for index selection
WHERE ST_DWithin(geom, placex.geometry, 0.001) and placex.rank_search = 26
ORDER BY CASE WHEN ST_GeometryType(geom) = 'ST_Line' THEN
(ST_distance(placex.geometry, ST_LineInterpolatePoint(geom,0))+
ST_distance(placex.geometry, ST_LineInterpolatePoint(geom,0.5))+
@@ -164,7 +162,7 @@ DECLARE
newend INTEGER;
moddiff SMALLINT;
linegeo GEOMETRY;
splitpoint FLOAT;
splitline GEOMETRY;
sectiongeo GEOMETRY;
postcode TEXT;
stepmod SMALLINT;
@@ -223,27 +221,15 @@ BEGIN
FROM placex, generate_series(1, array_upper(waynodes, 1)) nodeidpos
WHERE osm_type = 'N' and osm_id = waynodes[nodeidpos]::BIGINT
and address is not NULL and address ? 'housenumber'
and ST_Distance(NEW.linegeo, geometry) < 0.0005
ORDER BY nodeidpos
LOOP
{% if debug %}RAISE WARNING 'processing point % (%)', nextnode.hnr, ST_AsText(nextnode.geometry);{% endif %}
IF linegeo is null THEN
linegeo := NEW.linegeo;
ELSE
splitpoint := ST_LineLocatePoint(linegeo, nextnode.geometry);
IF splitpoint = 0 THEN
-- Corner case where the splitpoint falls on the first point
-- and thus would not return a geometry. Skip that section.
sectiongeo := NULL;
ELSEIF splitpoint = 1 THEN
-- Point is at the end of the line.
sectiongeo := linegeo;
linegeo := NULL;
ELSE
-- Split the line.
sectiongeo := ST_LineSubstring(linegeo, 0, splitpoint);
linegeo := ST_LineSubstring(linegeo, splitpoint, 1);
END IF;
splitline := ST_Split(ST_Snap(linegeo, nextnode.geometry, 0.0005), nextnode.geometry);
sectiongeo := ST_GeometryN(splitline, 1);
linegeo := ST_GeometryN(splitline, 2);
END IF;
IF prevnode.hnr is not null
@@ -251,9 +237,6 @@ BEGIN
-- regularly mapped housenumbers.
-- (Conveniently also fails if one of the house numbers is not a number.)
and abs(prevnode.hnr - nextnode.hnr) > NEW.step
-- If the interpolation geometry is broken or two nodes are at the
-- same place, then splitting might produce a point. Ignore that.
and ST_GeometryType(sectiongeo) = 'ST_LineString'
THEN
IF prevnode.hnr < nextnode.hnr THEN
startnumber := prevnode.hnr;
@@ -315,12 +298,12 @@ BEGIN
NEW.address, postcode,
NEW.country_code, NEW.geometry_sector, 0);
END IF;
END IF;
-- early break if we are out of line string,
-- might happen when a line string loops back on itself
IF linegeo is null or ST_GeometryType(linegeo) != 'ST_LineString' THEN
RETURN NEW;
-- early break if we are out of line string,
-- might happen when a line string loops back on itself
IF ST_GeometryType(linegeo) != 'ST_LineString' THEN
RETURN NEW;
END IF;
END IF;
prevnode := nextnode;
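One side of the change above splits the interpolation line at each house-number node via `ST_LineLocatePoint`/`ST_LineSubstring`, with explicit corner cases for a split point falling on the line's start or end. A hedged Python sketch of just that control flow, with fraction pairs standing in for LineString geometries:

```python
def split_at(line, position):
    """Mirror the splitpoint branches: return (section, remainder).

    `line` is a (start, end) fraction pair standing in for a LineString;
    `position` is ST_LineLocatePoint's result in [0, 1].
    """
    if position == 0:
        # Split point coincides with the line start and would yield no
        # geometry: skip that section.
        return None, line
    if position == 1:
        # Split point is the line end: the whole remainder is the section.
        return line, None
    start, end = line
    cut = start + (end - start) * position
    return (start, cut), (cut, end)
```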

@@ -34,11 +34,6 @@ BEGIN
RETURN null;
END IF;
-- Remove the place from the list of places to be deleted
DELETE FROM place_to_be_deleted pdel
WHERE pdel.osm_type = NEW.osm_type and pdel.osm_id = NEW.osm_id
and pdel.class = NEW.class;
-- Have we already done this place?
SELECT * INTO existing
FROM place
@@ -47,6 +42,8 @@ BEGIN
{% if debug %}RAISE WARNING 'Existing: %',existing.osm_id;{% endif %}
-- Handle a place changing type by removing the old data.
-- (This trigger is executed BEFORE INSERT of the NEW tuple.)
IF existing.osm_type IS NULL THEN
DELETE FROM place where osm_type = NEW.osm_type and osm_id = NEW.osm_id and class = NEW.class;
END IF;
@@ -190,11 +187,15 @@ BEGIN
END IF;
{% endif %}
IF existingplacex.osm_type is not NULL THEN
-- Mark any existing place for delete in the placex table
UPDATE placex SET indexed_status = 100
WHERE placex.osm_type = NEW.osm_type and placex.osm_id = NEW.osm_id
and placex.class = NEW.class and placex.type = NEW.type;
IF existing.osm_type IS NOT NULL THEN
-- Pathological case caused by the triggerless copy into place during initial import
-- force delete even for large areas, it will be reinserted later
UPDATE place SET geometry = ST_SetSRID(ST_Point(0,0), 4326)
WHERE osm_type = NEW.osm_type and osm_id = NEW.osm_id
and class = NEW.class and type = NEW.type;
DELETE FROM place
WHERE osm_type = NEW.osm_type and osm_id = NEW.osm_id
and class = NEW.class and type = NEW.type;
END IF;
-- Process it as a new insertion
@@ -205,27 +206,6 @@ BEGIN
{% if debug %}RAISE WARNING 'insert done % % % % %',NEW.osm_type,NEW.osm_id,NEW.class,NEW.type,NEW.name;{% endif %}
IF existing.osm_type is not NULL THEN
-- If there is already an entry in place, just update that, if necessary.
IF coalesce(existing.name, ''::hstore) != coalesce(NEW.name, ''::hstore)
or coalesce(existing.address, ''::hstore) != coalesce(NEW.address, ''::hstore)
or coalesce(existing.extratags, ''::hstore) != coalesce(NEW.extratags, ''::hstore)
or coalesce(existing.admin_level, 15) != coalesce(NEW.admin_level, 15)
or existing.geometry::text != NEW.geometry::text
THEN
UPDATE place
SET name = NEW.name,
address = NEW.address,
extratags = NEW.extratags,
admin_level = NEW.admin_level,
geometry = NEW.geometry
WHERE osm_type = NEW.osm_type and osm_id = NEW.osm_id
and class = NEW.class and type = NEW.type;
END IF;
RETURN NULL;
END IF;
RETURN NEW;
END IF;
@@ -341,79 +321,35 @@ BEGIN
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE FUNCTION place_delete()
RETURNS TRIGGER
AS $$
DECLARE
deferred BOOLEAN;
has_rank BOOLEAN;
BEGIN
{% if debug %}RAISE WARNING 'Delete for % % %/%', OLD.osm_type, OLD.osm_id, OLD.class, OLD.type;{% endif %}
deferred := ST_IsValid(OLD.geometry) and ST_Area(OLD.geometry) > 2;
IF deferred THEN
SELECT bool_or(not (rank_address = 0 or rank_address > 25)) INTO deferred
FROM placex
WHERE osm_type = OLD.osm_type and osm_id = OLD.osm_id
and class = OLD.class and type = OLD.type;
{% if debug %}RAISE WARNING 'delete: % % % %',OLD.osm_type,OLD.osm_id,OLD.class,OLD.type;{% endif %}
-- deleting large polygons can have a massive effect on the system - require manual intervention to let them through
IF st_area(OLD.geometry) > 2 and st_isvalid(OLD.geometry) THEN
SELECT bool_or(not (rank_address = 0 or rank_address > 25)) as ranked FROM placex WHERE osm_type = OLD.osm_type and osm_id = OLD.osm_id and class = OLD.class and type = OLD.type INTO has_rank;
IF has_rank THEN
insert into import_polygon_delete (osm_type, osm_id, class, type) values (OLD.osm_type,OLD.osm_id,OLD.class,OLD.type);
RETURN NULL;
END IF;
END IF;
INSERT INTO place_to_be_deleted (osm_type, osm_id, class, type, deferred)
VALUES(OLD.osm_type, OLD.osm_id, OLD.class, OLD.type, deferred);
-- mark for delete
UPDATE placex set indexed_status = 100 where osm_type = OLD.osm_type and osm_id = OLD.osm_id and class = OLD.class and type = OLD.type;
RETURN NULL;
-- interpolations are special
IF OLD.osm_type='W' and OLD.class = 'place' and OLD.type = 'houses' THEN
UPDATE location_property_osmline set indexed_status = 100 where osm_id = OLD.osm_id; -- osm_id = wayid (=old.osm_id)
END IF;
RETURN OLD;
END;
$$ LANGUAGE plpgsql;
CREATE OR REPLACE FUNCTION flush_deleted_places()
RETURNS INTEGER
AS $$
BEGIN
-- deleting large polygons can have a massive effect on the system - require manual intervention to let them through
INSERT INTO import_polygon_delete (osm_type, osm_id, class, type)
SELECT osm_type, osm_id, class, type FROM place_to_be_deleted WHERE deferred;
-- delete from place table
ALTER TABLE place DISABLE TRIGGER place_before_delete;
DELETE FROM place USING place_to_be_deleted
WHERE place.osm_type = place_to_be_deleted.osm_type
and place.osm_id = place_to_be_deleted.osm_id
and place.class = place_to_be_deleted.class
and place.type = place_to_be_deleted.type
and not deferred;
ALTER TABLE place ENABLE TRIGGER place_before_delete;
-- Mark for delete in the placex table
UPDATE placex SET indexed_status = 100 FROM place_to_be_deleted
WHERE placex.osm_type = 'N' and place_to_be_deleted.osm_type = 'N'
and placex.osm_id = place_to_be_deleted.osm_id
and placex.class = place_to_be_deleted.class
and placex.type = place_to_be_deleted.type
and not deferred;
UPDATE placex SET indexed_status = 100 FROM place_to_be_deleted
WHERE placex.osm_type = 'W' and place_to_be_deleted.osm_type = 'W'
and placex.osm_id = place_to_be_deleted.osm_id
and placex.class = place_to_be_deleted.class
and placex.type = place_to_be_deleted.type
and not deferred;
UPDATE placex SET indexed_status = 100 FROM place_to_be_deleted
WHERE placex.osm_type = 'R' and place_to_be_deleted.osm_type = 'R'
and placex.osm_id = place_to_be_deleted.osm_id
and placex.class = place_to_be_deleted.class
and placex.type = place_to_be_deleted.type
and not deferred;
-- Mark for delete in interpolations
UPDATE location_property_osmline SET indexed_status = 100 FROM place_to_be_deleted
WHERE place_to_be_deleted.osm_type = 'W'
and place_to_be_deleted.class = 'place'
and place_to_be_deleted.type = 'houses'
and location_property_osmline.osm_id = place_to_be_deleted.osm_id
and not deferred;
-- Clear todo list.
TRUNCATE TABLE place_to_be_deleted;
RETURN NULL;
END;
$$ LANGUAGE plpgsql;
$$
LANGUAGE plpgsql;
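The deletion handling above hinges on one predicate: large valid polygons whose placex rows carry an address rank are not removed immediately but flagged for manual intervention. A minimal sketch of that decision (names are illustrative, not Nominatim API):

```python
def defer_deletion(area, is_valid, has_address_rank):
    # Deleting a large administrative polygon outright would rip a hole
    # into the address hierarchy, so such rows are routed to the
    # import_polygon_delete table for manual review instead.
    return bool(is_valid and area > 2.0 and has_address_rank)
```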

@@ -107,17 +107,12 @@ LANGUAGE plpgsql STABLE;
CREATE OR REPLACE FUNCTION find_associated_street(poi_osm_type CHAR(1),
poi_osm_id BIGINT,
bbox GEOMETRY)
poi_osm_id BIGINT)
RETURNS BIGINT
AS $$
DECLARE
location RECORD;
parent RECORD;
result BIGINT;
distance FLOAT;
new_distance FLOAT;
waygeom GEOMETRY;
BEGIN
FOR location IN
SELECT members FROM planet_osm_rels
@@ -128,34 +123,19 @@ BEGIN
FOR i IN 1..array_upper(location.members, 1) BY 2 LOOP
IF location.members[i+1] = 'street' THEN
FOR parent IN
SELECT place_id, geometry
FROM placex
SELECT place_id from placex
WHERE osm_type = upper(substring(location.members[i], 1, 1))::char(1)
and osm_id = substring(location.members[i], 2)::bigint
and name is not null
and rank_search between 26 and 27
LOOP
-- Find the closest 'street' member.
-- Avoid distance computation for the frequent case where there is
-- only one street member.
IF waygeom is null THEN
result := parent.place_id;
waygeom := parent.geometry;
ELSE
distance := coalesce(distance, ST_Distance(waygeom, bbox));
new_distance := ST_Distance(parent.geometry, bbox);
IF new_distance < distance THEN
distance := new_distance;
result := parent.place_id;
waygeom := parent.geometry;
END IF;
END IF;
RETURN parent.place_id;
END LOOP;
END IF;
END LOOP;
END LOOP;
RETURN result;
RETURN NULL;
END;
$$
LANGUAGE plpgsql STABLE;
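The nearest-member selection in `find_associated_street` above follows a lazy pattern: remember the first candidate without computing anything (the frequent single-street case), and only start measuring distances once a second member shows up. A Python sketch of that pattern (tuples stand in for PostGIS geometries):

```python
def pick_closest_street(members, distance_to_poi):
    """Return the member closest to the POI, computing distances lazily.

    `members` is a list of (place_id, geometry) pairs; `distance_to_poi`
    is a callable standing in for ST_Distance against the POI's bbox.
    """
    result = None
    best_geom = None
    best_distance = None
    for place_id, geom in members:
        if best_geom is None:
            # Frequent case: a single street member needs no distance at all.
            result, best_geom = place_id, geom
        else:
            if best_distance is None:
                best_distance = distance_to_poi(best_geom)
            new_distance = distance_to_poi(geom)
            if new_distance < best_distance:
                best_distance = new_distance
                result, best_geom = place_id, geom
    return result
```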
@@ -182,7 +162,7 @@ BEGIN
{% if debug %}RAISE WARNING 'finding street for % %', poi_osm_type, poi_osm_id;{% endif %}
-- Is this object part of an associatedStreet relation?
parent_place_id := find_associated_street(poi_osm_type, poi_osm_id, bbox);
parent_place_id := find_associated_street(poi_osm_type, poi_osm_id);
IF parent_place_id is null THEN
parent_place_id := find_parent_for_address(token_info, poi_partition, bbox);
@@ -205,7 +185,7 @@ BEGIN
RETURN location.place_id;
END IF;
parent_place_id := find_associated_street('W', location.osm_id, bbox);
parent_place_id := find_associated_street('W', location.osm_id);
END LOOP;
END IF;
@@ -217,7 +197,6 @@ BEGIN
SELECT place_id FROM placex
WHERE bbox && geometry AND _ST_Covers(geometry, ST_Centroid(bbox))
AND rank_address between 5 and 25
AND ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon')
ORDER BY rank_address desc
LOOP
RETURN location.place_id;
@@ -233,7 +212,6 @@ BEGIN
SELECT place_id FROM placex
WHERE bbox && geometry AND _ST_Covers(geometry, ST_Centroid(bbox))
AND rank_address between 5 and 25
AND ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon')
ORDER BY rank_address desc
LOOP
RETURN location.place_id;
@@ -297,9 +275,7 @@ BEGIN
-- If extratags has a place tag, look for linked nodes by their place type.
-- Area and node still have to have the same name.
IF bnd.extratags ? 'place' and bnd.extratags->'place' != 'postcode'
and bnd_name is not null
THEN
IF bnd.extratags ? 'place' and bnd_name is not null THEN
FOR linked_placex IN
SELECT * FROM placex
WHERE (position(lower(name->'name') in bnd_name) > 0
@@ -308,6 +284,7 @@ BEGIN
AND placex.osm_type = 'N'
AND (placex.linked_place_id is null or placex.linked_place_id = bnd.place_id)
AND placex.rank_search < 26 -- needed to select the right index
AND placex.type != 'postcode'
AND ST_Covers(bnd.geometry, placex.geometry)
LOOP
{% if debug %}RAISE WARNING 'Found type-matching place node %', linked_placex.osm_id;{% endif %}
@@ -869,8 +846,7 @@ BEGIN
FROM placex
WHERE osm_type = 'R' and class = 'boundary' and type = 'administrative'
and admin_level < NEW.admin_level and admin_level > 3
and rank_address between 1 and 25 -- for index selection
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon') -- for index selection
and rank_address > 0
and geometry && NEW.centroid and _ST_Covers(geometry, NEW.centroid)
ORDER BY admin_level desc LIMIT 1
LOOP
@@ -898,9 +874,8 @@ BEGIN
FROM placex,
LATERAL compute_place_rank(country_code, 'A', class, type,
admin_level, False, null) prank
WHERE class = 'place' and rank_address between 1 and 23
WHERE class = 'place' and rank_address < 24
and prank.address_rank >= NEW.rank_address
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon') -- select right index
and geometry && NEW.geometry
and geometry ~ NEW.geometry -- needed because ST_Relate does not do bbox cover test
and ST_Relate(geometry, NEW.geometry, 'T*T***FF*') -- contains but not equal
@@ -921,8 +896,6 @@ BEGIN
LATERAL compute_place_rank(country_code, 'A', class, type,
admin_level, False, null) prank
WHERE prank.address_rank < 24
and rank_address between 1 and 25 -- select right index
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon') -- select right index
and prank.address_rank >= NEW.rank_address
and geometry && NEW.geometry
and geometry ~ NEW.geometry -- needed because ST_Relate does not do bbox cover test
@@ -943,8 +916,6 @@ BEGIN
LATERAL compute_place_rank(country_code, 'A', class, type,
admin_level, False, null) prank
WHERE osm_type = 'R'
and rank_address between 1 and 25 -- select right index
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon') -- select right index
and ((class = 'place' and prank.address_rank = NEW.rank_address)
or (class = 'boundary' and rank_address = NEW.rank_address))
and geometry && NEW.centroid and _ST_Covers(geometry, NEW.centroid)
@@ -985,7 +956,7 @@ BEGIN
NEW.importance := null;
SELECT wikipedia, importance
FROM compute_importance(NEW.extratags, NEW.country_code, NEW.rank_search, NEW.centroid)
FROM compute_importance(NEW.extratags, NEW.country_code, NEW.osm_type, NEW.osm_id)
INTO NEW.wikipedia,NEW.importance;
{% if debug %}RAISE WARNING 'Importance computed from wikipedia: %', NEW.importance;{% endif %}
@@ -1067,7 +1038,7 @@ BEGIN
IF linked_place is not null THEN
-- Recompute the ranks here as the ones from the linked place might
-- have been shifted to accommodate surrounding boundaries.
SELECT place_id, osm_id, class, type, extratags, rank_search,
SELECT place_id, osm_id, class, type, extratags,
centroid, geometry,
(compute_place_rank(country_code, osm_type, class, type, admin_level,
(extratags->'capital') = 'yes', null)).*
@@ -1108,7 +1079,7 @@ BEGIN
SELECT wikipedia, importance
FROM compute_importance(location.extratags, NEW.country_code,
location.rank_search, NEW.centroid)
'N', location.osm_id)
INTO linked_wikipedia,linked_importance;
-- Use the maximum importance if one could be computed from the linked object.
@@ -1120,7 +1091,7 @@ BEGIN
ELSE
-- No linked place? As a last resort check if the boundary is tagged with
-- a place type and adapt the rank address.
IF NEW.rank_address between 4 and 25 and NEW.extratags ? 'place' THEN
IF NEW.rank_address > 0 and NEW.extratags ? 'place' THEN
SELECT address_rank INTO place_address_level
FROM compute_place_rank(NEW.country_code, 'A', 'place',
NEW.extratags->'place', 0::SMALLINT, False, null);
@@ -1230,11 +1201,7 @@ BEGIN
{% endif %}
END IF;
IF NEW.postcode is null AND NEW.rank_search > 8
AND (NEW.rank_address > 0
OR ST_GeometryType(NEW.geometry) not in ('ST_LineString','ST_MultiLineString')
OR ST_Length(NEW.geometry) < 0.02)
THEN
IF NEW.postcode is null AND NEW.rank_search > 8 THEN
NEW.postcode := get_nearest_postcode(NEW.country_code, NEW.geometry);
END IF;

@@ -273,8 +273,8 @@ BEGIN
END IF;
RETURN ST_Envelope(ST_Collect(
ST_Project(geom::geography, radius, 0.785398)::geometry,
ST_Project(geom::geography, radius, 3.9269908)::geometry));
ST_Project(geom, radius, 0.785398)::geometry,
ST_Project(geom, radius, 3.9269908)::geometry));
END;
$$
LANGUAGE plpgsql IMMUTABLE;
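The expression above builds a bounding box by projecting the centre point by `radius` at bearings 0.785398 rad (45°, NE) and 3.9269908 rad (225°, SW) and taking the envelope of the two points. A flat-earth Python approximation for illustration only (`ST_Project` works on geographies with geodesic math):

```python
import math

def point_bbox(x, y, radius):
    # Bearings are measured clockwise from north: dx = r*sin(b), dy = r*cos(b).
    ne = (x + radius * math.sin(0.785398), y + radius * math.cos(0.785398))
    sw = (x + radius * math.sin(3.9269908), y + radius * math.cos(3.9269908))
    # Envelope of the two opposite corners: a square box around the centre.
    return (min(ne[0], sw[0]), min(ne[1], sw[1]),
            max(ne[0], sw[0]), max(ne[1], sw[1]))
```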
@@ -429,10 +429,9 @@ BEGIN
SELECT osm_type, osm_id, class, type FROM placex WHERE place_id = placeid INTO osmtype, osmid, pclass, ptype;
DELETE FROM import_polygon_delete where osm_type = osmtype and osm_id = osmid and class = pclass and type = ptype;
DELETE FROM import_polygon_error where osm_type = osmtype and osm_id = osmid and class = pclass and type = ptype;
-- force delete by directly entering it into the to-be-deleted table
INSERT INTO place_to_be_deleted (osm_type, osm_id, class, type, deferred)
VALUES(osmtype, osmid, pclass, ptype, false);
PERFORM flush_deleted_places();
-- force delete from place/placex by making it a very small geometry
UPDATE place set geometry = ST_SetSRID(ST_Point(0,0), 4326) where osm_type = osmtype and osm_id = osmid and class = pclass and type = ptype;
DELETE FROM place where osm_type = osmtype and osm_id = osmid and class = pclass and type = ptype;
RETURN TRUE;
END;

@@ -10,86 +10,65 @@
CREATE INDEX IF NOT EXISTS idx_place_addressline_address_place_id
ON place_addressline USING BTREE (address_place_id) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_placex_rank_search
ON placex USING BTREE (rank_search) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_placex_rank_address
ON placex USING BTREE (rank_address) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_placex_parent_place_id
ON placex USING BTREE (parent_place_id) {{db.tablespace.search_index}}
WHERE parent_place_id IS NOT NULL;
---
CREATE INDEX IF NOT EXISTS idx_placex_geometry ON placex
USING GIST (geometry) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_placex_geometry_reverse_lookupPolygon
ON placex USING gist (geometry) {{db.tablespace.search_index}}
WHERE St_GeometryType(geometry) in ('ST_Polygon', 'ST_MultiPolygon')
AND rank_address between 4 and 25 AND type != 'postcode'
AND name is not null AND indexed_status = 0 AND linked_place_id is null;
---
CREATE INDEX IF NOT EXISTS idx_osmline_parent_place_id
ON location_property_osmline USING BTREE (parent_place_id) {{db.tablespace.search_index}}
WHERE parent_place_id is not null;
---
CREATE INDEX IF NOT EXISTS idx_osmline_parent_osm_id
ON location_property_osmline USING BTREE (osm_id) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_postcode_postcode
ON location_postcode USING BTREE (postcode) {{db.tablespace.search_index}};
{% if drop %}
---
DROP INDEX IF EXISTS idx_placex_geometry_address_area_candidates;
DROP INDEX IF EXISTS idx_placex_geometry_buildings;
DROP INDEX IF EXISTS idx_placex_geometry_lower_rank_ways;
DROP INDEX IF EXISTS idx_placex_wikidata;
DROP INDEX IF EXISTS idx_placex_rank_address_sector;
DROP INDEX IF EXISTS idx_placex_rank_boundaries_sector;
{% else %}
-- Indices only needed for updating.
---
{% if not drop %}
CREATE INDEX IF NOT EXISTS idx_placex_pendingsector
ON placex USING BTREE (rank_address,geometry_sector) {{db.tablespace.address_index}}
WHERE indexed_status > 0;
CREATE INDEX IF NOT EXISTS idx_location_area_country_place_id
ON location_area_country USING BTREE (place_id) {{db.tablespace.address_index}};
---
CREATE UNIQUE INDEX IF NOT EXISTS idx_place_osm_unique
ON place USING btree(osm_id, osm_type, class, type) {{db.tablespace.address_index}};
---
-- Table needed for running updates with osm2pgsql on place.
CREATE TABLE IF NOT EXISTS place_to_be_deleted (
osm_type CHAR(1),
osm_id BIGINT,
class TEXT,
type TEXT,
deferred BOOLEAN
);
{% endif %}
-- Indices only needed for search.
{% if 'search_name' in db.tables %}
---
CREATE INDEX IF NOT EXISTS idx_search_name_nameaddress_vector
ON search_name USING GIN (nameaddress_vector) WITH (fastupdate = off) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_search_name_name_vector
ON search_name USING GIN (name_vector) WITH (fastupdate = off) {{db.tablespace.search_index}};
---
CREATE INDEX IF NOT EXISTS idx_search_name_centroid
ON search_name USING GIST (centroid) {{db.tablespace.search_index}};
{% if postgres.has_index_non_key_column %}
---
CREATE INDEX IF NOT EXISTS idx_placex_housenumber
ON placex USING btree (parent_place_id)
INCLUDE (housenumber) {{db.tablespace.search_index}}
WHERE housenumber is not null;
---
CREATE INDEX IF NOT EXISTS idx_osmline_parent_osm_id_with_hnr
ON location_property_osmline USING btree(parent_place_id)
INCLUDE (startnumber, endnumber) {{db.tablespace.search_index}}
WHERE startnumber is not null;
{% endif %}
{% endif %}

@@ -137,9 +137,7 @@ CREATE TABLE place_addressline (
) {{db.tablespace.search_data}};
CREATE INDEX idx_place_addressline_place_id on place_addressline USING BTREE (place_id) {{db.tablespace.search_index}};
--------- PLACEX - storage for all indexed places -----------------
DROP TABLE IF EXISTS placex;
drop table if exists placex;
CREATE TABLE placex (
place_id BIGINT NOT NULL,
parent_place_id BIGINT,
@@ -159,66 +157,20 @@ CREATE TABLE placex (
postcode TEXT,
centroid GEOMETRY(Geometry, 4326)
) {{db.tablespace.search_data}};
CREATE UNIQUE INDEX idx_place_id ON placex USING BTREE (place_id) {{db.tablespace.search_index}};
{% for osm_type in ('N', 'W', 'R') %}
CREATE INDEX idx_placex_osmid_{{osm_type | lower}} ON placex
USING BTREE (osm_id) {{db.tablespace.search_index}}
WHERE osm_type = '{{osm_type}}';
{% endfor %}
-- Usage: - removing linkage status on update
-- - lookup linked places for /details
CREATE INDEX idx_placex_linked_place_id ON placex
USING BTREE (linked_place_id) {{db.tablespace.address_index}}
WHERE linked_place_id IS NOT NULL;
-- Usage: - check that admin boundaries do not overtake each other rank-wise
-- - check that place node in a admin boundary with the same address level
-- - boundary is not completely contained in a place area
-- - parenting of large-area or unparentable features
CREATE INDEX idx_placex_geometry_address_area_candidates ON placex
USING gist (geometry) {{db.tablespace.address_index}}
WHERE rank_address between 1 and 25
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon');
-- Usage: - POI is within building with housenumber
CREATE INDEX idx_placex_osmid ON placex USING BTREE (osm_type, osm_id) {{db.tablespace.search_index}};
CREATE INDEX idx_placex_linked_place_id ON placex USING BTREE (linked_place_id) {{db.tablespace.address_index}} WHERE linked_place_id IS NOT NULL;
CREATE INDEX idx_placex_rank_search ON placex USING BTREE (rank_search, geometry_sector) {{db.tablespace.address_index}};
CREATE INDEX idx_placex_geometry ON placex USING GIST (geometry) {{db.tablespace.search_index}};
CREATE INDEX idx_placex_geometry_buildings ON placex
USING {{postgres.spgist_geom}} (geometry) {{db.tablespace.address_index}}
USING {{postgres.spgist_geom}} (geometry) {{db.tablespace.search_index}}
WHERE address is not null and rank_search = 30
and ST_GeometryType(geometry) in ('ST_Polygon','ST_MultiPolygon');
-- Usage: - linking of similar named places to boundaries
-- - linking of place nodes with same type to boundaries
-- - lookupPolygon()
CREATE INDEX idx_placex_geometry_placenode ON placex
USING {{postgres.spgist_geom}} (geometry) {{db.tablespace.address_index}}
USING {{postgres.spgist_geom}} (geometry) {{db.tablespace.search_index}}
WHERE osm_type = 'N' and rank_search < 26
and class = 'place' and type != 'postcode';
-- Usage: - is node part of a way?
-- - find parent of interpolation spatially
CREATE INDEX idx_placex_geometry_lower_rank_ways ON placex
USING {{postgres.spgist_geom}} (geometry) {{db.tablespace.address_index}}
WHERE osm_type = 'W' and rank_search >= 26;
-- Usage: - linking place nodes by wikidata tag to boundaries
CREATE INDEX idx_placex_wikidata on placex
USING BTREE ((extratags -> 'wikidata')) {{db.tablespace.address_index}}
WHERE extratags ? 'wikidata' and class = 'place'
and osm_type = 'N' and rank_search < 26;
-- The following two indexes function as a todo list for indexing.
CREATE INDEX idx_placex_rank_address_sector ON placex
USING BTREE (rank_address, geometry_sector) {{db.tablespace.address_index}}
WHERE indexed_status > 0;
CREATE INDEX idx_placex_rank_boundaries_sector ON placex
USING BTREE (rank_search, geometry_sector) {{db.tablespace.address_index}}
WHERE class = 'boundary' and type = 'administrative'
and indexed_status > 0;
and class = 'place' and type != 'postcode' and linked_place_id is null;
CREATE INDEX idx_placex_wikidata on placex USING BTREE ((extratags -> 'wikidata')) {{db.tablespace.address_index}} WHERE extratags ? 'wikidata' and class = 'place' and osm_type = 'N' and rank_search < 26;
DROP SEQUENCE IF EXISTS seq_place;
CREATE SEQUENCE seq_place start 1;
@@ -276,7 +228,7 @@ CREATE SEQUENCE file start 1;
-- null table so it won't error
-- Deliberately no DROP: importing the table is expensive and the data is static,
-- so if it is already there it is better to avoid removing it.
CREATE TABLE IF NOT EXISTS wikipedia_article (
CREATE TABLE wikipedia_article (
language text NOT NULL,
title text NOT NULL,
langcount integer,
@@ -290,12 +242,15 @@ CREATE TABLE IF NOT EXISTS wikipedia_article (
wd_page_title text,
instance_of text
);
ALTER TABLE ONLY wikipedia_article ADD CONSTRAINT wikipedia_article_pkey PRIMARY KEY (language, title);
CREATE INDEX idx_wikipedia_article_osm_id ON wikipedia_article USING btree (osm_type, osm_id);
CREATE TABLE IF NOT EXISTS wikipedia_redirect (
CREATE TABLE wikipedia_redirect (
language text,
from_title text,
to_title text
);
ALTER TABLE ONLY wikipedia_redirect ADD CONSTRAINT wikipedia_redirect_pkey PRIMARY KEY (language, from_title);
-- osm2pgsql does not create indexes on the middle tables for Nominatim
-- Add one for lookup of associated street relations.

@@ -1,6 +1,6 @@
# just use the pgxs makefile
foreach(suffix ${PostgreSQL_ADDITIONAL_VERSIONS} "15" "14" "13" "12" "11" "10" "9.6")
foreach(suffix ${PostgreSQL_ADDITIONAL_VERSIONS} "14" "13" "12" "11" "10" "9.6")
list(APPEND PG_CONFIG_HINTS
"/usr/pgsql-${suffix}/bin")
endforeach()

@@ -76,25 +76,21 @@ class UpdateAddData:
osm2pgsql_params = args.osm2pgsql_options(default_cache=1000, default_threads=1)
if args.file or args.diff:
return add_osm_data.add_data_from_file(args.config.get_libpq_dsn(),
cast(str, args.file or args.diff),
return add_osm_data.add_data_from_file(cast(str, args.file or args.diff),
osm2pgsql_params)
if args.node:
return add_osm_data.add_osm_object(args.config.get_libpq_dsn(),
'node', args.node,
return add_osm_data.add_osm_object('node', args.node,
args.use_main_api,
osm2pgsql_params)
if args.way:
return add_osm_data.add_osm_object(args.config.get_libpq_dsn(),
'way', args.way,
return add_osm_data.add_osm_object('way', args.way,
args.use_main_api,
osm2pgsql_params)
if args.relation:
return add_osm_data.add_osm_object(args.config.get_libpq_dsn(),
'relation', args.relation,
return add_osm_data.add_osm_object('relation', args.relation,
args.use_main_api,
osm2pgsql_params)

@@ -20,7 +20,6 @@ from nominatim.clicmd.args import NominatimArgs
LOG = logging.getLogger()
class AdminFuncs:
"""\
Analyse and maintain the database.
@@ -37,8 +36,6 @@ class AdminFuncs:
help='Migrate the database to a new software version')
objs.add_argument('--analyse-indexing', action='store_true',
help='Print performance analysis of the indexing process')
objs.add_argument('--collect-os-info', action="store_true",
help="Generate a report about the host system information")
group = parser.add_argument_group('Arguments for cache warming')
group.add_argument('--search-only', action='store_const', dest='target',
const='search',
@@ -73,14 +70,9 @@ class AdminFuncs:
from ..tools import migration
return migration.migrate(args.config, args)
if args.collect_os_info:
LOG.warning("Reporting System Information")
from ..tools import collect_os_info
collect_os_info.report_system_information(args.config)
return 0
return 1
def _warm(self, args: NominatimArgs) -> int:
LOG.warning('Warming database caches')
params = ['warm.php']

@@ -248,9 +248,9 @@ class APIDetails:
if args.node:
params = dict(osmtype='N', osmid=args.node)
elif args.way:
params = dict(osmtype='W', osmid=args.way)
params = dict(osmtype='W', osmid=args.node)
elif args.relation:
params = dict(osmtype='R', osmid=args.relation)
params = dict(osmtype='R', osmid=args.node)
else:
params = dict(place_id=args.place_id)
if args.object_class:

@@ -76,7 +76,6 @@ class NominatimArgs:
warm: bool
check_database: bool
migrate: bool
collect_os_info: bool
analyse_indexing: bool
target: Optional[str]
osm_id: Optional[str]
@@ -115,7 +114,6 @@ class NominatimArgs:
address_levels: bool
functions: bool
wiki_data: bool
secondary_importance: bool
importance: bool
website: bool
diffs: bool
@@ -184,10 +182,8 @@ class NominatimArgs:
return dict(osm2pgsql=self.config.OSM2PGSQL_BINARY or self.osm2pgsql_path,
osm2pgsql_cache=self.osm2pgsql_cache or default_cache,
osm2pgsql_style=self.config.get_import_style_file(),
osm2pgsql_style_path=self.config.config_dir,
threads=self.threads or default_threads,
dsn=self.config.get_libpq_dsn(),
forward_dependencies=self.config.get_bool('UPDATE_FORWARD_DEPENDENCIES'),
flatnode_file=str(self.config.get_path('FLATNODE_FILE') or ''),
tablespaces=dict(slim_data=self.config.TABLESPACE_OSM_DATA,
slim_index=self.config.TABLESPACE_OSM_INDEX,

View File

@@ -63,8 +63,6 @@ class UpdateRefresh:
help='Update the PL/pgSQL functions in the database')
group.add_argument('--wiki-data', action='store_true',
help='Update Wikipedia/data importance numbers')
group.add_argument('--secondary-importance', action='store_true',
help='Update secondary importance raster data')
group.add_argument('--importance', action='store_true',
help='Recompute place importances (expensive!)')
group.add_argument('--website', action='store_true',
@@ -85,7 +83,7 @@ class UpdateRefresh:
help='Enable debug warning statements in functions')
def run(self, args: NominatimArgs) -> int: #pylint: disable=too-many-branches, too-many-statements
def run(self, args: NominatimArgs) -> int: #pylint: disable=too-many-branches
from ..tools import refresh, postcodes
from ..indexer.indexer import Indexer
@@ -117,20 +115,6 @@ class UpdateRefresh:
with connect(args.config.get_libpq_dsn()) as conn:
refresh.load_address_levels_from_config(conn, args.config)
# Attention: must come BEFORE functions
if args.secondary_importance:
with connect(args.config.get_libpq_dsn()) as conn:
# If the table did not exist before, then the importance code
# needs to be enabled.
if not conn.table_exists('secondary_importance'):
args.functions = True
LOG.warning('Import secondary importance raster data from %s', args.project_dir)
if refresh.import_secondary_importance(args.config.get_libpq_dsn(),
args.project_dir) > 0:
LOG.fatal('FATAL: Cannot update secondary importance raster data')
return 1
if args.functions:
LOG.warning('Create functions')
with connect(args.config.get_libpq_dsn()) as conn:

View File

@@ -15,7 +15,7 @@ from pathlib import Path
import psutil
from nominatim.config import Configuration
from nominatim.db.connection import connect
from nominatim.db.connection import connect, Connection
from nominatim.db import status, properties
from nominatim.tokenizer.base import AbstractTokenizer
from nominatim.version import version_str
@@ -59,7 +59,7 @@ class SetupAll:
help="Do not keep tables that are only needed for "
"updating the database later")
group2.add_argument('--offline', action='store_true',
help="Do not attempt to load any additional data from the internet")
help="Do not attempt to load any additional data from the internet")
group3 = parser.add_argument_group('Expert options')
group3.add_argument('--ignore-errors', action='store_true',
help='Continue import even when errors in SQL are present')
@@ -72,8 +72,6 @@ class SetupAll:
from ..tools import database_import, refresh, postcodes, freeze
from ..indexer.indexer import Indexer
num_threads = args.threads or psutil.cpu_count() or 1
country_info.setup_country_config(args.config)
if args.continue_at is None:
@@ -96,21 +94,14 @@ class SetupAll:
drop=args.no_updates,
ignore_errors=args.ignore_errors)
self._setup_tables(args.config, args.reverse_only)
LOG.warning('Importing wikipedia importance data')
data_path = Path(args.config.WIKIPEDIA_DATA_PATH or args.project_dir)
if refresh.import_wikipedia_articles(args.config.get_libpq_dsn(),
data_path) > 0:
LOG.error('Wikipedia importance dump file not found. '
'Calculating importance values of locations will not '
'use Wikipedia importance data.')
LOG.warning('Importing secondary importance raster data')
if refresh.import_secondary_importance(args.config.get_libpq_dsn(),
args.project_dir) != 0:
LOG.error('Secondary importance file not imported. '
'Falling back to default ranking.')
self._setup_tables(args.config, args.reverse_only)
'Will be using default importances.')
if args.continue_at is None or args.continue_at == 'load-data':
LOG.warning('Initialise tables')
@@ -118,7 +109,8 @@ class SetupAll:
database_import.truncate_data_tables(conn)
LOG.warning('Load data into placex table')
database_import.load_data(args.config.get_libpq_dsn(), num_threads)
database_import.load_data(args.config.get_libpq_dsn(),
args.threads or psutil.cpu_count() or 1)
LOG.warning("Setting up tokenizer")
tokenizer = self._get_tokenizer(args.continue_at, args.config)
@@ -129,15 +121,18 @@ class SetupAll:
args.project_dir, tokenizer)
if args.continue_at is None or args.continue_at in ('load-data', 'indexing'):
if args.continue_at is not None and args.continue_at != 'load-data':
with connect(args.config.get_libpq_dsn()) as conn:
self._create_pending_index(conn, args.config.TABLESPACE_ADDRESS_INDEX)
LOG.warning('Indexing places')
indexer = Indexer(args.config.get_libpq_dsn(), tokenizer, num_threads)
indexer = Indexer(args.config.get_libpq_dsn(), tokenizer,
args.threads or psutil.cpu_count() or 1)
indexer.index_full(analyse=not args.index_noanalyse)
LOG.warning('Post-process tables')
with connect(args.config.get_libpq_dsn()) as conn:
database_import.create_search_indices(conn, args.config,
drop=args.no_updates,
threads=num_threads)
drop=args.no_updates)
LOG.warning('Create search index for default country names.')
country_info.create_country_names(conn, tokenizer,
args.config.get_str_list('LANGUAGES'))
@@ -193,6 +188,27 @@ class SetupAll:
return tokenizer_factory.get_tokenizer_for_db(config)
def _create_pending_index(self, conn: Connection, tablespace: str) -> None:
""" Add a supporting index for finding places still to be indexed.
This index is normally created at the end of the import process
for later updates. When indexing was only partially done, this
index can greatly improve the speed of going through already indexed data.
"""
if conn.index_exists('idx_placex_pendingsector'):
return
with conn.cursor() as cur:
LOG.warning('Creating support index')
if tablespace:
tablespace = 'TABLESPACE ' + tablespace
cur.execute(f"""CREATE INDEX idx_placex_pendingsector
ON placex USING BTREE (rank_address,geometry_sector)
{tablespace} WHERE indexed_status > 0
""")
conn.commit()
def _finalize_database(self, dsn: str, offline: bool) -> None:
""" Determine the database date and set the status accordingly.
"""

View File

@@ -69,8 +69,8 @@ class DBConnection:
self.current_params: Optional[Sequence[Any]] = None
self.ignore_sql_errors = ignore_sql_errors
self.conn: Optional['psycopg2._psycopg.connection'] = None
self.cursor: Optional['psycopg2._psycopg.cursor'] = None
self.conn: Optional['psycopg2.connection'] = None
self.cursor: Optional['psycopg2.cursor'] = None
self.connect(cursor_factory=cursor_factory)
def close(self) -> None:
@@ -78,7 +78,7 @@ class DBConnection:
"""
if self.conn is not None:
if self.cursor is not None:
self.cursor.close()
self.cursor.close() # type: ignore[no-untyped-call]
self.cursor = None
self.conn.close()

View File

@@ -31,7 +31,7 @@ class Cursor(psycopg2.extras.DictCursor):
""" Query execution that logs the SQL query when debugging is enabled.
"""
if LOG.isEnabledFor(logging.DEBUG):
LOG.debug(self.mogrify(query, args).decode('utf-8'))
LOG.debug(self.mogrify(query, args).decode('utf-8')) # type: ignore[no-untyped-call]
super().execute(query, args)

View File

@@ -11,7 +11,6 @@ from typing import Set, Dict, Any
import jinja2
from nominatim.db.connection import Connection
from nominatim.db.async_connection import WorkerPool
from nominatim.config import Configuration
def _get_partitions(conn: Connection) -> Set[int]:
@@ -97,21 +96,3 @@ class SQLPreprocessor:
with conn.cursor() as cur:
cur.execute(sql)
conn.commit()
def run_parallel_sql_file(self, dsn: str, name: str, num_threads: int = 1,
**kwargs: Any) -> None:
""" Execure the given SQL files using parallel asynchronous connections.
The keyword arguments may supply additional parameters for
preprocessing.
After preprocessing, the SQL code is cut at lines containing only
'---'. Each chunk is sent to one of the `num_threads` workers.
"""
sql = self.env.get_template(name).render(**kwargs)
parts = sql.split('\n---\n')
with WorkerPool(dsn, num_threads) as pool:
for part in parts:
pool.next_free_worker().perform(part)
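The chunking convention used by `run_parallel_sql_file` can be exercised in isolation. This is a minimal sketch (plain strings instead of Jinja2 templates and the Nominatim `WorkerPool`) of how a rendered SQL file is cut at lines containing only `---`:

```python
# Sketch of the chunking in run_parallel_sql_file: the rendered SQL is
# split at lines containing only '---'; each chunk would then be handed
# to one of the asynchronous worker connections.
sql = """CREATE INDEX idx_a ON t (a);
---
CREATE INDEX idx_b ON t (b);
---
CREATE INDEX idx_c ON t (c);"""

parts = sql.split('\n---\n')

for num, part in enumerate(parts):
    # In Nominatim this is pool.next_free_worker().perform(part).
    print(num, part.strip())
```

Note that the separator must occupy a line of its own; a `---` embedded in other text is not split on.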

View File

@@ -118,4 +118,4 @@ class CopyBuffer:
"""
if self.buffer.tell() > 0:
self.buffer.seek(0)
cur.copy_from(self.buffer, table, columns=columns)
cur.copy_from(self.buffer, table, columns=columns) # type: ignore[no-untyped-call]

View File

@@ -128,64 +128,58 @@ class Indexer:
with conn.cursor() as cur:
cur.execute('ANALYZE')
if self.index_by_rank(0, 4) > 0:
_analyze()
self.index_by_rank(0, 4)
_analyze()
if self.index_boundaries(0, 30) > 100:
_analyze()
self.index_boundaries(0, 30)
_analyze()
if self.index_by_rank(5, 25) > 100:
_analyze()
self.index_by_rank(5, 25)
_analyze()
if self.index_by_rank(26, 30) > 1000:
_analyze()
self.index_by_rank(26, 30)
_analyze()
if self.index_postcodes() > 100:
_analyze()
self.index_postcodes()
_analyze()
def index_boundaries(self, minrank: int, maxrank: int) -> int:
def index_boundaries(self, minrank: int, maxrank: int) -> None:
""" Index only administrative boundaries within the given rank range.
"""
total = 0
LOG.warning("Starting indexing boundaries using %s threads",
self.num_threads)
with self.tokenizer.name_analyzer() as analyzer:
for rank in range(max(minrank, 4), min(maxrank, 26)):
total += self._index(runners.BoundaryRunner(rank, analyzer))
self._index(runners.BoundaryRunner(rank, analyzer))
return total
def index_by_rank(self, minrank: int, maxrank: int) -> int:
def index_by_rank(self, minrank: int, maxrank: int) -> None:
""" Index all entries of placex in the given rank range (inclusive)
in order of their address rank.
When rank 30 is requested then also interpolations and
places with address rank 0 will be indexed.
"""
total = 0
maxrank = min(maxrank, 30)
LOG.warning("Starting indexing rank (%i to %i) using %i threads",
minrank, maxrank, self.num_threads)
with self.tokenizer.name_analyzer() as analyzer:
for rank in range(max(1, minrank), maxrank + 1):
total += self._index(runners.RankRunner(rank, analyzer), 20 if rank == 30 else 1)
self._index(runners.RankRunner(rank, analyzer), 20 if rank == 30 else 1)
if maxrank == 30:
total += self._index(runners.RankRunner(0, analyzer))
total += self._index(runners.InterpolationRunner(analyzer), 20)
return total
self._index(runners.RankRunner(0, analyzer))
self._index(runners.InterpolationRunner(analyzer), 20)
def index_postcodes(self) -> int:
def index_postcodes(self) -> None:
"""Index the entries of the location_postcode table.
"""
LOG.warning("Starting indexing postcodes using %s threads", self.num_threads)
return self._index(runners.PostcodeRunner(), 20)
self._index(runners.PostcodeRunner(), 20)
def update_status_table(self) -> None:
@@ -197,7 +191,7 @@ class Indexer:
conn.commit()
def _index(self, runner: runners.Runner, batch: int = 1) -> int:
def _index(self, runner: runners.Runner, batch: int = 1) -> None:
""" Index a single rank or table. `runner` describes the SQL to use
for indexing. `batch` describes the number of objects that
should be processed with a single SQL statement
@@ -239,4 +233,4 @@ class Indexer:
conn.commit()
return progress.done()
progress.done()

View File

@@ -55,7 +55,7 @@ class ProgressLogger:
self.next_info += int(places_per_sec) * self.log_interval
def done(self) -> int:
def done(self) -> None:
""" Print final statistics about the progress.
"""
rank_end_time = datetime.now()
@@ -70,5 +70,3 @@ class ProgressLogger:
LOG.warning("Done %d/%d in %d @ %.3f per second - FINISHED %s\n",
self.done_places, self.total_places, int(diff_seconds),
places_per_sec, self.name)
return self.done_places

View File

@@ -1,46 +0,0 @@
# SPDX-License-Identifier: GPL-2.0-only
#
# This file is part of Nominatim. (https://nominatim.org)
#
# Copyright (C) 2022 by the Nominatim developer community.
# For a full list of authors see the git log.
"""
Sanitizer that preprocesses tags from the TIGER import.
It makes the following changes:
* remove state reference from tiger:county
"""
from typing import Callable
import re
from nominatim.tokenizer.sanitizers.base import ProcessInfo
from nominatim.tokenizer.sanitizers.config import SanitizerConfig
COUNTY_MATCH = re.compile('(.*), [A-Z][A-Z]')
def _clean_tiger_county(obj: ProcessInfo) -> None:
""" Remove the state reference from tiger:county tags.
This transforms a name like 'Hamilton, AL' into 'Hamilton'.
If no state reference is detected at the end, the name is left as is.
"""
if not obj.address:
return
for item in obj.address:
if item.kind == 'tiger' and item.suffix == 'county':
m = COUNTY_MATCH.fullmatch(item.name)
if m:
item.name = m[1]
# Switch kind and suffix, the split left them reversed.
item.kind = 'county'
item.suffix = 'tiger'
return
def create(_: SanitizerConfig) -> Callable[[ProcessInfo], None]:
""" Create a housenumber processing function.
"""
return _clean_tiger_county
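The behaviour of the `COUNTY_MATCH` pattern above can be demonstrated standalone; this sketch copies the regex and applies it the same way the sanitizer does:

```python
import re

# Same pattern as the sanitizer: a name followed by ", XX" (two-letter
# state code). fullmatch() ensures the state code sits at the very end.
COUNTY_MATCH = re.compile('(.*), [A-Z][A-Z]')

def clean_county(name: str) -> str:
    """Strip a trailing state reference, e.g. 'Hamilton, AL' -> 'Hamilton'."""
    m = COUNTY_MATCH.fullmatch(name)
    return m[1] if m else name

print(clean_county('Hamilton, AL'))  # state suffix removed
print(clean_county('Hamilton'))      # no suffix, left unchanged
```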

View File

@@ -12,34 +12,23 @@ from pathlib import Path
import logging
import urllib
from nominatim.db.connection import connect
from nominatim.tools.exec_utils import run_osm2pgsql, get_url
LOG = logging.getLogger()
def _run_osm2pgsql(dsn: str, options: MutableMapping[str, Any]) -> None:
run_osm2pgsql(options)
# Handle deletions
with connect(dsn) as conn:
with conn.cursor() as cur:
cur.execute('SELECT flush_deleted_places()')
conn.commit()
def add_data_from_file(dsn: str, fname: str, options: MutableMapping[str, Any]) -> int:
def add_data_from_file(fname: str, options: MutableMapping[str, Any]) -> int:
""" Adds data from a OSM file to the database. The file may be a normal
OSM file or a diff file in all formats supported by libosmium.
"""
options['import_file'] = Path(fname)
options['append'] = True
_run_osm2pgsql(dsn, options)
run_osm2pgsql(options)
# No status update. We don't know where the file came from.
return 0
def add_osm_object(dsn: str, osm_type: str, osm_id: int, use_main_api: bool,
def add_osm_object(osm_type: str, osm_id: int, use_main_api: bool,
options: MutableMapping[str, Any]) -> int:
""" Add or update a single OSM object from the latest version of the
API.
@@ -62,6 +51,6 @@ def add_osm_object(dsn: str, osm_type: str, osm_id: int, use_main_api: bool,
options['append'] = True
options['import_data'] = get_url(base_url).encode('utf-8')
_run_osm2pgsql(dsn, options)
run_osm2pgsql(options)
return 0

View File

@@ -114,10 +114,9 @@ def _get_indexes(conn: Connection) -> List[str]:
indexes.extend(('idx_placex_housenumber',
'idx_osmline_parent_osm_id_with_hnr'))
if conn.table_exists('place'):
indexes.extend(('idx_location_area_country_place_id',
'idx_place_osm_unique',
'idx_placex_rank_address_sector',
'idx_placex_rank_boundaries_sector'))
indexes.extend(('idx_placex_pendingsector',
'idx_location_area_country_place_id',
'idx_place_osm_unique'))
return indexes
@@ -200,7 +199,7 @@ def check_tokenizer(_: Connection, config: Configuration) -> CheckResult:
def check_existance_wikipedia(conn: Connection, _: Configuration) -> CheckResult:
""" Checking for wikipedia/wikidata data
"""
if not conn.table_exists('search_name') or not conn.table_exists('place'):
if not conn.table_exists('search_name'):
return CheckState.NOT_APPLICABLE
with conn.cursor() as cur:

View File

@@ -1,166 +0,0 @@
# SPDX-License-Identifier: GPL-2.0-only
#
# This file is part of Nominatim. (https://nominatim.org)
#
# Copyright (C) 2022 by the Nominatim developer community.
# For a full list of authors see the git log.
"""
Collection of host system information including software versions, memory,
storage, and database configuration.
"""
import os
import subprocess
import sys
from pathlib import Path
from typing import List, Optional, Tuple, Union
import psutil
from psycopg2.extensions import make_dsn, parse_dsn
from nominatim.config import Configuration
from nominatim.db.connection import connect
from nominatim.version import version_str
def convert_version(ver_tup: Tuple[int, int]) -> str:
"""converts tuple version (ver_tup) to a string representation"""
return ".".join(map(str, ver_tup))
def friendly_memory_string(mem: float) -> str:
"""Create a user friendly string for the amount of memory specified as mem"""
mem_magnitude = ("bytes", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB")
mag = 0
# determine order of magnitude
while mem > 1000:
mem /= 1000
mag += 1
return f"{mem:.1f} {mem_magnitude[mag]}"
def run_command(cmd: Union[str, List[str]]) -> str:
"""Runs a command using the shell and returns the output from stdout"""
try:
if sys.version_info < (3, 7):
cap_out = subprocess.run(cmd, stdout=subprocess.PIPE, check=False)
else:
cap_out = subprocess.run(cmd, capture_output=True, check=False)
return cap_out.stdout.decode("utf-8")
except FileNotFoundError:
# non-Linux systems should end up here
return f"Unknown (unable to find the '{cmd}' command)"
def os_name_info() -> str:
"""Obtain Operating System Name (and possibly the version)"""
os_info = None
# man page os-release(5) details meaning of the fields
if Path("/etc/os-release").is_file():
os_info = from_file_find_line_portion(
"/etc/os-release", "PRETTY_NAME", "=")
# alternative location
elif Path("/usr/lib/os-release").is_file():
os_info = from_file_find_line_portion(
"/usr/lib/os-release", "PRETTY_NAME", "="
)
# fallback on Python's os name
if os_info is None or os_info == "":
os_info = os.name
# if the above is insufficient, take a look at neofetch's approach to OS detection
return os_info
# Note: Intended to be used on informational files like /proc
def from_file_find_line_portion(
filename: str, start: str, sep: str, fieldnum: int = 1
) -> Optional[str]:
"""open filename, finds the line starting with the 'start' string.
Splits the line using seperator and returns a "fieldnum" from the split."""
with open(filename, encoding='utf8') as file:
result = ""
for line in file:
if line.startswith(start):
result = line.split(sep)[fieldnum].strip()
return result
def get_postgresql_config(version: int) -> str:
"""Retrieve postgres configuration file"""
try:
with open(f"/etc/postgresql/{version}/main/postgresql.conf", encoding='utf8') as file:
db_config = file.read()
file.close()
return db_config
except IOError:
return f"**Could not read '/etc/postgresql/{version}/main/postgresql.conf'**"
def report_system_information(config: Configuration) -> None:
"""Generate a report about the host system including software versions, memory,
storage, and database configuration."""
with connect(make_dsn(config.get_libpq_dsn(), dbname='postgres')) as conn:
postgresql_ver: str = convert_version(conn.server_version_tuple())
with conn.cursor() as cur:
num = cur.scalar("SELECT count(*) FROM pg_catalog.pg_database WHERE datname=%s",
(parse_dsn(config.get_libpq_dsn())['dbname'], ))
nominatim_db_exists = num == 1 if isinstance(num, int) else False
if nominatim_db_exists:
with connect(config.get_libpq_dsn()) as conn:
postgis_ver: str = convert_version(conn.postgis_version_tuple())
else:
postgis_ver = "Unable to connect to database"
postgresql_config: str = get_postgresql_config(int(float(postgresql_ver)))
# Note: psutil.disk_partitions() is similar to run_command("lsblk")
# Note: run_command("systemd-detect-virt") only works on Linux, on other OSes
# should give a message: "Unknown (unable to find the 'systemd-detect-virt' command)"
# Generates the Markdown report.
report = f"""
**Instructions**
Use this information in your issue report at https://github.com/osm-search/Nominatim/issues
Redirect the output to a file:
$ ./collect_os_info.py > report.md
**Software Environment:**
- Python version: {sys.version}
- Nominatim version: {version_str()}
- PostgreSQL version: {postgresql_ver}
- PostGIS version: {postgis_ver}
- OS: {os_name_info()}
**Hardware Configuration:**
- RAM: {friendly_memory_string(psutil.virtual_memory().total)}
- number of CPUs: {psutil.cpu_count(logical=False)}
- bare metal/AWS/other cloud service (per systemd-detect-virt(1)): {run_command("systemd-detect-virt")}
- type and size of disks:
**`df -h` - report file system disk space usage:**
```
{run_command(["df", "-h"])}
```
**lsblk - list block devices:**
```
{run_command("lsblk")}
```
**Postgresql Configuration:**
```
{postgresql_config}
```
**Notes**
Please add any notes about anything above that is incorrect.
"""
print(report)
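The magnitude loop in `friendly_memory_string` can be tried on its own; the function is reproduced here verbatim for illustration:

```python
def friendly_memory_string(mem: float) -> str:
    """Render a byte count with a decimal (power-of-1000) unit suffix."""
    mem_magnitude = ("bytes", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB")
    mag = 0
    # divide down until the value fits below 1000, tracking the magnitude
    while mem > 1000:
        mem /= 1000
        mag += 1
    return f"{mem:.1f} {mem_magnitude[mag]}"

print(friendly_memory_string(16_508_051_456))  # a typical RAM total
print(friendly_memory_string(512))
```

Note the function intentionally uses decimal (1000-based) units, matching the `df -h`-style output elsewhere in the report, rather than binary (1024-based) KiB/MiB units.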

View File

@@ -75,11 +75,6 @@ def setup_database_skeleton(dsn: str, rouser: Optional[str] = None) -> None:
with conn.cursor() as cur:
cur.execute('CREATE EXTENSION IF NOT EXISTS hstore')
cur.execute('CREATE EXTENSION IF NOT EXISTS postgis')
postgis_version = conn.postgis_version_tuple()
if postgis_version[0] >= 3:
cur.execute('CREATE EXTENSION IF NOT EXISTS postgis_raster')
conn.commit()
_require_version('PostGIS',
@@ -230,8 +225,7 @@ def load_data(dsn: str, threads: int) -> None:
cur.execute('ANALYSE')
def create_search_indices(conn: Connection, config: Configuration,
drop: bool = False, threads: int = 1) -> None:
def create_search_indices(conn: Connection, config: Configuration, drop: bool = False) -> None:
""" Create tables that have explicit partitioning.
"""
@@ -249,5 +243,4 @@ def create_search_indices(conn: Connection, config: Configuration,
sql = SQLPreprocessor(conn, config)
sql.run_parallel_sql_file(config.get_libpq_dsn(),
'indices.sql', min(8, threads), drop=drop)
sql.run_sql_file(conn, 'indices.sql', drop=drop)

View File

@@ -10,7 +10,6 @@ Helper functions for executing external programs.
from typing import Any, Union, Optional, Mapping, IO
from pathlib import Path
import logging
import os
import subprocess
import urllib.request as urlrequest
from urllib.parse import urlencode
@@ -117,27 +116,21 @@ def run_osm2pgsql(options: Mapping[str, Any]) -> None:
env = get_pg_env(options['dsn'])
cmd = [str(options['osm2pgsql']),
'--hstore', '--latlon', '--slim',
'--with-forward-dependencies', 'false',
'--log-progress', 'true',
'--number-processes', '1' if options['append'] else str(options['threads']),
'--number-processes', str(options['threads']),
'--cache', str(options['osm2pgsql_cache']),
'--output', 'gazetteer',
'--style', str(options['osm2pgsql_style'])
]
if str(options['osm2pgsql_style']).endswith('.lua'):
env['LUA_PATH'] = ';'.join((str(options['osm2pgsql_style_path'] / 'flex-base.lua'),
os.environ.get('LUAPATH', ';')))
cmd.extend(('--output', 'flex'))
if options['append']:
cmd.append('--append')
else:
cmd.extend(('--output', 'gazetteer'))
cmd.append('--append' if options['append'] else '--create')
cmd.append('--create')
if options['flatnode_file']:
cmd.extend(('--flat-nodes', options['flatnode_file']))
if not options.get('forward_dependencies', False):
cmd.extend(('--with-forward-dependencies', 'false'))
for key, param in (('slim_data', '--tablespace-slim-data'),
('slim_index', '--tablespace-slim-index'),
('main_data', '--tablespace-main-data'),

View File

@@ -315,36 +315,3 @@ def mark_internal_country_names(conn: Connection, config: Configuration, **_: An
names = {}
names['countrycode'] = country_code
analyzer.add_country_names(country_code, names)
@_migration(4, 1, 99, 0)
def add_place_deletion_todo_table(conn: Connection, **_: Any) -> None:
""" Add helper table for deleting data on updates.
The table is only necessary when updates are possible, i.e.
the database is not in freeze mode.
"""
if conn.table_exists('place'):
with conn.cursor() as cur:
cur.execute("""CREATE TABLE IF NOT EXISTS place_to_be_deleted (
osm_type CHAR(1),
osm_id BIGINT,
class TEXT,
type TEXT,
deferred BOOLEAN)""")
@_migration(4, 1, 99, 1)
def split_pending_index(conn: Connection, **_: Any) -> None:
""" Reorganise indexes for pending updates.
"""
if conn.table_exists('place'):
with conn.cursor() as cur:
cur.execute("""CREATE INDEX IF NOT EXISTS idx_placex_rank_address_sector
ON placex USING BTREE (rank_address, geometry_sector)
WHERE indexed_status > 0""")
cur.execute("""CREATE INDEX IF NOT EXISTS idx_placex_rank_boundaries_sector
ON placex USING BTREE (rank_search, geometry_sector)
WHERE class = 'boundary' and type = 'administrative'
and indexed_status > 0""")
cur.execute("DROP INDEX IF EXISTS idx_placex_pendingsector")

View File

@@ -15,7 +15,7 @@ from pathlib import Path
from psycopg2 import sql as pysql
from nominatim.config import Configuration
from nominatim.db.connection import Connection, connect
from nominatim.db.connection import Connection
from nominatim.db.utils import execute_file
from nominatim.db.sql_preprocessor import SQLPreprocessor
from nominatim.version import version_str
@@ -146,25 +146,6 @@ def import_wikipedia_articles(dsn: str, data_path: Path, ignore_errors: bool = F
return 0
def import_secondary_importance(dsn: str, data_path: Path, ignore_errors: bool = False) -> int:
""" Replaces the secondary importance raster data table with new data.
Returns 0 if all was well and 1 if the raster SQL file could not
be found. Throws an exception if there was an error reading the file.
"""
datafile = data_path / 'secondary_importance.sql.gz'
if not datafile.exists():
return 1
with connect(dsn) as conn:
postgis_version = conn.postgis_version_tuple()
if postgis_version[0] < 3:
LOG.error('PostGIS version is too old for using OSM raster data.')
return 2
execute_file(dsn, datafile, ignore_errors=ignore_errors)
return 0
def recompute_importance(conn: Connection) -> None:
""" Recompute wikipedia links and importance for all entries in placex.
@@ -176,7 +157,7 @@ def recompute_importance(conn: Connection) -> None:
cur.execute("""
UPDATE placex SET (wikipedia, importance) =
(SELECT wikipedia, importance
FROM compute_importance(extratags, country_code, rank_search, centroid))
FROM compute_importance(extratags, country_code, osm_type, osm_id))
""")
cur.execute("""
UPDATE placex s SET wikipedia = d.wikipedia, importance = d.importance

View File

@@ -130,7 +130,10 @@ def update(conn: Connection, options: MutableMapping[str, Any],
if endseq is None:
return UpdateState.NO_CHANGES
run_osm2pgsql_updates(conn, options)
# Consume updates with osm2pgsql.
options['append'] = True
options['disable_jit'] = conn.server_version_tuple() >= (11, 0)
run_osm2pgsql(options)
# Write the current status to the file
endstate = repl.get_state_info(endseq)
@@ -140,25 +143,6 @@ def update(conn: Connection, options: MutableMapping[str, Any],
return UpdateState.UP_TO_DATE
def run_osm2pgsql_updates(conn: Connection, options: MutableMapping[str, Any]) -> None:
""" Run osm2pgsql in append mode.
"""
# Remove any stale deletion marks.
with conn.cursor() as cur:
cur.execute('TRUNCATE place_to_be_deleted')
conn.commit()
# Consume updates with osm2pgsql.
options['append'] = True
options['disable_jit'] = conn.server_version_tuple() >= (11, 0)
run_osm2pgsql(options)
# Handle deletions
with conn.cursor() as cur:
cur.execute('SELECT flush_deleted_places()')
conn.commit()
def _make_replication_server(url: str, timeout: int) -> ContextManager[ReplicationServer]:
""" Returns a ReplicationServer in form of a context manager.

View File

@@ -2,7 +2,7 @@
#
# This file is part of Nominatim. (https://nominatim.org)
#
# Copyright (C) 2023 by the Nominatim developer community.
# Copyright (C) 2022 by the Nominatim developer community.
# For a full list of authors see the git log.
"""
Version information for Nominatim.
@@ -25,7 +25,7 @@ from typing import Optional, Tuple
# patch level when cherry-picking the commit with the migration.
#
# Released versions always have a database patch level of 0.
NOMINATIM_VERSION = (4, 2, 4, 0)
NOMINATIM_VERSION = (4, 1, 1, 0)
POSTGRESQL_REQUIRED_VERSION = (9, 6)
POSTGIS_REQUIRED_VERSION = (2, 2)

View File

@@ -0,0 +1,71 @@
name:
default: De Nederlandse Antillen
af: Nederlandse Antille
an: Antillas Neerlandesas
ar: جزر الأنتيل
be: Нідэрландскія Антылы
bg: Холандски Антили
br: Antilhez Nederlandat
bs: Holandski Antili
ca: Antilles Neerlandeses
cs: Nizozemské Antily
cy: Antilles yr Iseldiroedd
da: Nederlandske Antiller
de: Niederländische Antillen
dv: ނެދަލޭންޑު އެންޓިލޭ
el: Ολλανδικές Αντίλλες
en: Netherlands Antilles
eo: Nederlandaj Antiloj
es: Antillas Neerlandesas;Antillas Holandesas;Indias Occidentales Holandesas
et: Hollandi Antillid
eu: Holandarren Antillak
fa: آنتیل هلند
fi: Alankomaiden Antillit
fo: Niðurlendsku Antillurnar
fr: Antilles néerlandaises
fy: Nederlânske Antillen
ga: Aintillí na hÍsiltíre
gl: Antillas Neerlandesas
he: האנטילים ההולנדיים
hi: नीदरलैंड एंटीलीज़
hr: Nizozemski Antili
hu: Holland Antillák
ia: Antillas Nederlandese
id: Antillen Belanda
io: Nederlandana Antili
is: Hollensku Antillaeyjar
it: Antille Olandesi
ja: オランダ領アンティル
jv: Antillen Walanda
ka: ნიდერლანდის ანტილები
kk: Антийлер
ko: 네덜란드령 안틸레스
kw: Antillys Iseldiryek
la: Antillae Nederlandiae
lb: Hollännesch Antillen
li: Nederlandse Antille
ln: Antiya ya Holanda
lt: Nyderlandų Antilai
lv: Antiļas
mn: Нидерландын Антиллийн Арлууд
mr: नेदरलँड्स अँटिल्स
ms: Antillen Belanda
nn: Dei nederlandske Antillane
"no": De nederlandske Antillene
pl: Antyle Holenderskie
pt: Antilhas Holandesas
ro: Antilele Olandeze
ru: Нидерландские Антилы
sh: Nizozemski Antili
sk: Holandské Antily
sl: Nizozemski Antili
sr: Холандски Антили
sv: Nederländska Antillerna
sw: Antili za Kiholanzi
ta: நெதர்லாந்து அண்டிலிசு
tg: Антил Ҳоланд
th: เนเธอร์แลนด์แอนทิลลิส
tr: Hollanda Antilleri
uk: Нідерландські Антильські острови
vi: Antille thuộc Hà Lan
zh: 荷属安的列斯

View File

@@ -0,0 +1,2 @@
name:
default: Antarctica

View File

@@ -0,0 +1,2 @@
name:
default: American Samoa

View File

@@ -0,0 +1,2 @@
name:
default: Aruba

View File

@@ -0,0 +1,2 @@
name:
default: Aland Islands

View File

@@ -0,0 +1,2 @@
name:
default: Saint Barthélemy

View File

@@ -0,0 +1,2 @@
name:
default: "\N"

View File

@@ -0,0 +1,2 @@
name:
default: Bouvet Island

View File

@@ -0,0 +1,37 @@
name:
default: Cocos (Keeling) Islands
af: Cocos (Keeling) Eilande
ar: جزر كوكوس (كيلينغ)
be: Какосавыя (Кілінг) астравы
br: Inizi Kokoz
ca: Illes Cocos
da: Cocosøerne
de: Kokosinseln
el: Νησιά Κόκος
en: Cocos (Keeling) Islands
eo: Kokosinsuloj
es: Islas Cocos (Keeling)
et: Kookossaared
eu: Cocos (Keeling) uharteak
fa: جزایر کوکوس
fi: Kookossaaret
fr: Îles Cocos
fy: de Kokoseilannen
he: איי קוקוס (קילינג)
hr: Kokosovi otoci
hu: Kókusz (Keeling)-szigetek
id: Kepulauan Cocos (Keeling)
is: Kókoseyjar
it: Isole Cocos e Keeling
lt: Kokoso (Keelingo) salos
lv: Kokosu (Kīlinga) salas
mn: Кокосын (Кийлингийн) Арлууд
nl: Cocoseilanden
pl: Wyspy Kokosowe
ru: Кокосовые острова
sl: Kokosovi otoki
sv: Kokosöarna
tr: Cocos (Keeling) Adaları
uk: Кокосові острови
vi: Quần đảo Cocos (Keeling)
zh: 科科斯(基林)群島

View File

@@ -0,0 +1,7 @@
name:
default: Curaçao
en: Curaçao
es: Curazao
fr: Curaçao
ru: Кюрасао
sv: Curaçao

View File

@@ -0,0 +1,61 @@
name:
default: Christmas Island
af: Christmas-eiland
ar: جزيرة الميلاد
bg: Рождество
br: Enez Nedeleg
bs: Božićno ostrvo
ca: Illa Christmas
cs: Vánoční ostrov
cy: Ynys y Nadolig
da: Juleøen
de: Weihnachtsinsel
el: Νήσος των Χριστουγέννων
eo: Kristnaskinsulo
es: Isla de Navidad
et: Jõulusaar
eu: Christmas uhartea
fa: جزیره کریسمس
fi: Joulusaari
fr: Île Christmas
fy: Krysteilân
ga: Oileán na Nollag
gl: Illa de Nadal
he: טריטוריית האי חג המולד
hi: क्रिसमस आईलैंड
hr: Božićni otok
hu: Karácsony-sziget
id: Pulau Natal
is: Jólaeyja
it: Isola del Natale
ja: クリスマス島
ka: შობის კუნძული
kk: Кристмас аралы
ko: 크리스마스 섬
kw: Ynys Nadelik
lb: Chrëschtdagsinsel
lt: Kalėdų sala
lv: Ziemsvētku sala
mn: Зул Сарын Арал
mr: क्रिसमस द्वीप
ms: Pulau Krismas
nl: Christmaseiland
nn: Christmasøya
"no": Christmasøya
pl: Wyspa Bożego Narodzenia
pt: Ilha Christmas
ro: Insula Crăciunului
ru: Остров Рождества
sh: Božićni otok
sk: Vianočný ostrov
sl: Božični otoki
sr: Божићно Острво
sv: Julön
sw: Kisiwa cha Krismasi
ta: கிறிஸ்துமசு தீவு
th: เกาะคริสต์มาส
tr: Christmas Adası
uk: Острів Різдва
vi: Đảo Christmas
wo: Dunu Christmas
zh: 圣诞岛

View File

@@ -0,0 +1,41 @@
name:
default: Guyane Française
af: Frans-Guyana
ar: غيانا
br: Gwiana chall
ca: Guaiana Francesa
cy: Guyane
da: Fransk Guyana
de: Französisch-Guayana
el: Γαλλική Γουιάνα
en: French Guiana
eo: Gujano
es: Guayana Francesa
et: Prantsuse Guajaana
fa: گویان فرانسه
fi: Ranskan Guayana
fr: Guyane française
fy: Frânsk Guyana
ga: Guáin na Fraince
gd: Guiana Fhrangach
he: גיאנה הצרפתית
hr: Francuska Gvajana
hu: Francia Guyana
id: Guyana Perancis
is: Franska Gvæjana
it: Guyana francese
la: Guiana Francica
li: Frans Guyana
lt: Prancūzijos Gviana
lv: Franču Gviāna
mn: Франц Гвиана
nl: Frans-Guyana
pl: Gujana Francuska
ru: Французская Гвиана
sl: Francoska Gvajana
sv: Franska Guyana
th: เฟรนช์เกียนา
tr: Fransız Guyanası
uk: Французька Гвіана
vi: Guyane thuộc Pháp
zh: 法属圭亚那


@@ -0,0 +1,31 @@
name:
default: Guadeloupe
ar: غوادلوب
be: Гвадэлупа
br: Gwadeloup
ca: Illa de Guadalupe
da: Guadeloupe
el: Γουαδελούπη
en: Guadeloupe
eo: Gvadelupo
es: Guadalupe
fa: گوادلوپ
fi: Guadeloupe
fr: Guadeloupe
fy: Guadelûp
ga: Guadalúip
he: גוואדלופ
hr: Gvadalupa
hu: Guadeloupe
is: Gvadelúpeyjar
it: Guadalupa
la: Guadalupa
lt: Gvadelupa
lv: Gvadelupa
mn: Гуаделупе
pl: Gwadelupa
ru: Гваделупа
sv: Guadeloupe
th: กวาเดอลูป
uk: Гваделупа
zh: 瓜德罗普


@@ -0,0 +1,2 @@
name:
default: Guam


@@ -0,0 +1,2 @@
name:
default: Hong Kong


@@ -0,0 +1,2 @@
name:
default: Heard Island and McDonald Islands


@@ -0,0 +1,2 @@
name:
default: Saint Martin


@@ -0,0 +1,2 @@
name:
default: Macao


@@ -0,0 +1,2 @@
name:
default: Northern Mariana Islands


@@ -0,0 +1,30 @@
name:
default: Martinique
ar: مارتينيك
be: Марцініка
br: Martinik
ca: Martinica
da: Martinique
el: Μαρτινίκα
en: Martinique
eo: Martiniko
es: Martinica
fa: مارتینیک
fi: Martinique
fr: Martinique
fy: Martinyk
he: מרטיניק
hr: Martinik
hu: Martinique
id: Martinik
is: Martinique
it: Martinica
la: Martinica
lt: Martinika
lv: Martinika
mn: Мартиник
pl: Martynika
ru: Мартиника
sv: Martinique
uk: Мартиніка
zh: 馬提尼克


@@ -0,0 +1,37 @@
name:
default: Nouvelle-Calédonie
af: Nieu-Caledonia
ar: كاليدونيا الجديدة
be: Новая Каледонія
br: Kaledonia Nevez
ca: Nova Caledònia
cy: Caledonia Newydd
da: Ny Kaledonien
de: Neukaledonien
el: Νέα Καληδονία
en: New Caledonia
eo: Nov-Kaledonio
es: Nueva Caledonia
fa: کالدونیای جدید
fi: Uusi-Kaledonia
fr: Nouvelle-Calédonie
ga: An Nua-Chaladóin
he: קלדוניה החדשה
hr: Nova Kaledonija
hu: Új-Kaledónia
id: Kaledonia Baru
is: Nýja-Kaledónía
it: Nuova Caledonia
la: Nova Caledonia
lt: Naujoji Kaledonija
lv: Jaunkaledonija
mn: Шинэ Каледони
nl: Nieuw-Caledonië
pl: Nowa Kaledonia
ru: Новая Каледония
sl: Nova Kaledonija
sv: Nya Kaledonien
th: นิวแคลิโดเนีย
tr: Yeni Kaledonya
uk: Нова Каледонія
zh: 新喀里多尼亚


@@ -0,0 +1,36 @@
name:
default: Norfolk Island
af: Norfolkeiland
ar: جزيرة نورفولك
be: Норфалк
br: Enez Norfolk
ca: Illa Norfolk
cy: Ynys Norfolk
da: Norfolk-øen
de: Norfolkinsel
en: Norfolk Island
eo: Norfolkinsulo
es: Isla Norfolk
et: Norfolki saar
fi: Norfolkinsaari
fr: Île Norfolk
fy: Norfolk
ga: Oileán Norfolk
he: האי נורפוק
hr: Otok Norfolk
hu: Norfolk-sziget
id: Pulau Norfolk
is: Norfolkeyja
it: Isola Norfolk
la: Insula Norfolcia
lt: Norfolko sala
lv: Norfolkas sala
mn: Норфолк Арал
nl: Norfolk
pl: Wyspa Norfolk
ru: Остров Норфолк
sv: Norfolkön
tr: Norfolk Adası
uk: Острів Норфолк
vi: Đảo Norfolk
zh: 诺福克岛


@@ -0,0 +1,77 @@
name:
default: Polynésie française
af: Franse Polynesië
an: Polinesia Franzesa
ar: بولونيزيا الفرنسية
az: Fransa Polineziyası
be: Французская Палінезія
bg: Френска Полинезия
br: Polinezia Frañs
bs: Francuska Polinezija
ca: Polinèsia Francesa
cs: Francouzská Polynésie
cy: Polynesia Ffrengig
da: Fransk Polynesien
de: Französisch-Polynesien
dv: ފަރަންސޭސި ޕޮލިނޭޝިއާ
el: Γαλλική Πολυνησία
en: French Polynesia
eo: Franca Polinezio
es: Polinesia Francesa
et: Prantsuse Polüneesia
eu: Frantziar Polinesia
fa: پلی‌نزی فرانسه
fi: Ranskan Polynesia
fr: Polynésie française
fy: Frânsk Polyneezje
ga: Polainéis na Fraince
gd: French Polynesia
gl: Polinesia francesa
he: פולינזיה הצרפתית
hi: फ्रेंच पोलीनेशिया
hr: Francuska Polinezija
hu: Francia Polinézia
id: Polinesia Perancis
io: Franca Polinezia
is: Franska Pólýnesía
it: Polinesia francese
ja: フランス領ポリネシア
jv: Polinesia Perancis
kk: Франция Полинезиясы
ko: 프랑스령 폴리네시아
kw: Polynesi Frynkek
la: Polynesia Francica
lb: Franséisch-Polynesien
lt: Prancūzijos Polinezija
lv: Franču Polinēzija
mi: Porinīhia Wīwī
mk: Француска Полинезија
mn: Францын Полинез
mr: फ्रेंच पॉलिनेशिया
ms: Polinesia Perancis
nl: Frans-Polynesië
nn: Fransk Polynesia
"no": Fransk Polynesia
oc: Polinesia Francesa
os: Францы Полинези
pl: Polinezja Francuska
pt: Polinésia Francesa
qu: Phransis Pulinisya
ro: Polinezia Franceză
ru: Французская Полинезия
se: Frankriikka Polynesia
sh: Francuska Polinezija
sk: Francúzska Polynézia
sl: Francoska Polinezija
sr: Француска Полинезија
sv: Franska Polynesien
sw: Polynesia ya Kifaransa
ta: பிரெஞ்சு பொலினீசியா
th: เฟรนช์โปลินีเซีย
tr: Fransız Polinezyası
ty: Pōrīnetia Farāni
ug: Fransiyige Qarashliq Polinéziye
uk: Французька Полінезія
vi: Polynésie thuộc Pháp
wo: Polineesi gu Faraas
zh: 法属波利尼西亚


@@ -0,0 +1,19 @@
name:
default: Saint-Pierre-et-Miquelon
af: Saint-Pierre et Miquelon
be: Святы П’ер і Міквелон
da: Saint Pierre og Miquelon
de: Saint-Pierre und Miquelon
en: Saint Pierre and Miquelon
eo: Sankta-Piero kaj Mikelono
es: San Pedro y Miguelón
fi: Saint-Pierre ja Miquelon
fr: Saint-Pierre-et-Miquelon
hr: Sveti Petar i Mikelon
hu: Saint-Pierre és Miquelon
lt: Sen Pjeras ir Mikelonas
lv: Senpjēra un Mikelona
mn: Сент Пьер ба Микелон
sv: Saint-Pierre och Miquelon
tr: Saint-Pierre ve Miquelon
uk: Сен-П'єр і Мікелон


@@ -0,0 +1,2 @@
name:
default: Puerto Rico


@@ -0,0 +1,29 @@
name:
default: Réunion
af: Réunion
ar: ريونيون
be: Руньён
br: Ar Reunion
ca: Illa de la Reunió
da: Reunion
el: Ρεϊνιόν
eo: Reunio
es: La Reunión
fa: رئونیون
fi: Réunion
fr: La Réunion
he: ראוניון
hu: Réunion
is: Réunion
it: Riunione
la: Reunio
lt: Reunionas
lv: Reinjona
mn: Реюньон
pl: Reunion
ru: Реюньон
sl: Reunion
sv: Réunion
th: เรอูนียง
uk: Реюньйон
zh: 留尼汪


@@ -0,0 +1,2 @@
name:
default: Svalbard and Jan Mayen


@@ -0,0 +1,2 @@
name:
default: Sint Maarten


@@ -0,0 +1,48 @@
name:
default: Terres australes et antarctiques françaises
af: Franse Suidelike en Antarktiese Gebiede
an: Territorios Australs Franzeses
ar: الأراضي الجنوبية الفرنسية
be: Французскія Паўднёвыя тэрыторыі
bg: Френски южни и антарктически територии
br: Douaroù Aostral hag Antarktikel Frañs
ca: Terres Australs i Antàrtiques Franceses
cs: Francouzská jižní a antarktická území
da: Franske sydlige og Antarktiske territorier
de: Französische Süd- und Antarktisgebiete
el: Γαλλικά νότια και ανταρκτικά εδάφη
en: French Southern Lands
eo: Francaj Sudaj Teritorioj
es: Tierras Australes y Antárticas Francesas
eu: Frantziaren lurralde austral eta antartikoak
fi: Ranskan eteläiset ja antarktiset alueet
fr: Terres australes et antarctiques françaises
fy: Frânske Súdlike en Antarktyske Lannen
gl: Terras Austrais e Antárticas Francesas
hr: Francuski južni i antarktički teritoriji
hu: Francia déli és antarktiszi területek
id: Daratan Selatan dan Antarktika Perancis
is: Frönsku suðlægu landsvæðin
it: Terre Australi e Antartiche Francesi
ja: フランス領南方・南極地域
ko: 프랑스령 남부와 남극 지역
kw: Tiryow Deghow hag Antarktik Frynkek
lt: Prancūzijos Pietų Sritys
lv: Francijas Dienvidjūru un Antarktikas Zemes
nl: Franse Zuidelijke en Antarctische Gebieden
"no": De franske sørterritorier
oc: Tèrras Australas e Antarticas Francesas
pl: Francuskie Terytoria Południowe i Antarktyczne
pt: Terras Austrais e Antárticas Francesas
ro: Teritoriile australe şi antarctice franceze
ru: Французские Южные и Антарктические территории
sh: Francuske Južne Teritorije
sk: Francúzske južné a antarktické územia
sl: Francoske južne in antarktične dežele
sr: Француске јужне и антарктичке земље
sv: Franska sydterritorierna
ta: பிரெஞ்சு தென்னக நிலங்களும் அண்டாடிக் நிலமும்
tr: Fransız Güney ve Antarktika Toprakları
uk: Французькі Південні та Антарктичні території
vi: Vùng đất phía Nam và châu Nam Cực thuộc Pháp
zh: 法属南部领地


@@ -0,0 +1,2 @@
name:
default: United States Minor Outlying Islands


@@ -0,0 +1,2 @@
name:
default: United States Virgin Islands


@@ -0,0 +1,68 @@
name:
default: Wallis-et-Futuna
af: Wallis-en-Futuna
an: Wallis e Futuna
ar: جزر واليس وفوتونا
be: Уоліс і Футуна
bg: Уолис и Футуна
br: Wallis ha Futuna
ca: Wallis i Futuna
cs: Wallis a Futuna
cy: Wallis a Futuna
da: Wallis og Futuna
de: Wallis und Futuna
dv: ވާލީ އަދި ފުތޫނާ
el: Ουώλλις και Φουτούνα
en: Wallis and Futuna Islands
eo: Valiso kaj Futuno
es: Wallis y Futuna
et: Wallis ja Futuna
eu: Wallis eta Futuna
fa: والیس و فوتونا
fi: Wallis- ja Futunasaaret
fr: Wallis-et-Futuna
fy: Wallis en Fûtûna
ga: Vailís agus Futúna
gl: Wallis e Futuna
he: ואליס ופוטונה
hr: Wallis i Futuna
hu: Wallis és Futuna
id: Wallis dan Futuna
io: Wallis e Futuna Insuli
is: Wallis- og Fútúnaeyjar
it: Wallis e Futuna
ja: ウォリス・フツナ
jv: Wallis lan Futuna
ko: 왈리스 퓌튀나
kw: Wallis ha Futuna
la: Vallis et Futuna
lb: Wallis a Futuna
lt: Walliso ir Futuna salos
lv: Volisa un Futuna
mn: Уоллис ба Футуна
mr: वालिस व फुतुना
ms: Wallis dan Futuna
nl: Wallis en Futuna
nn: Wallis- og Futunaøyane
"no": Wallis- og Futunaøyene
oc: Wallis e Futuna
pl: Wallis i Futuna
pt: Wallis e Futuna
ro: Wallis şi Futuna
ru: Уоллис и Футуна
se: Wallis ja Futuna
sh: Wallis i Futuna
sk: Wallis a Futuna
sl: Wallis in Futuna
sm: Wallis and Futuna
sr: Валис и Футуна
sv: Wallis- och Futunaöarna
sw: Wallis na Futuna
ta: வலிசும் புட்டூனாவும்
th: หมู่เกาะวาลลิสและหมู่เกาะฟุตูนา
tr: Wallis ve Futuna Adaları
ug: Wallis we Futuna Taqim Aralliri
uk: Волліс і Футуна
vi: Wallis và Futuna
wo: Wallis ak Futuna
zh: 瓦利斯和富图纳群岛


@@ -0,0 +1,2 @@
name:
default: Mayotte


@@ -61,6 +61,13 @@ am:
pattern: "dddd"
# Netherlands Antilles (De Nederlandse Antillen)
an:
partition: 58
languages: nl, en, pap
names: !include country-names/an.yaml
# Angola (Angola)
ao:
partition: 85
@@ -69,6 +76,14 @@ ao:
postcode: no
# (Antarctica)
aq:
partition: 181
languages: en, es, fr, ru
names: !include country-names/aq.yaml
postcode: no
# Argentina (Argentina)
ar:
partition: 39
@@ -78,6 +93,13 @@ ar:
pattern: "l?dddd(?:lll)?"
# (American Samoa)
as:
partition: 182
languages: en, sm
names: !include country-names/as.yaml
# Austria (Österreich)
at:
partition: 245
@@ -96,6 +118,21 @@ au:
pattern: "dddd"
# (Aruba)
aw:
partition: 183
languages: nl, pap
names: !include country-names/aw.yaml
postcode: no
# (Aland Islands)
ax:
partition: 184
languages: sv
names: !include country-names/ax.yaml
# Azerbaijan (Azərbaycan)
az:
partition: 119
@@ -184,6 +221,13 @@ bj:
postcode: no
# (Saint Barthélemy)
bl:
partition: 204
languages: fr
names: !include country-names/bl.yaml
# Bermuda (Bermuda)
bm:
partition: 176
@@ -212,6 +256,13 @@ bo:
postcode: no
# Caribbean Netherlands (Caribisch Nederland)
bq:
partition: 250
languages: nl
names: !include country-names/bq.yaml
# Brazil (Brasil)
br:
partition: 121
@@ -239,6 +290,13 @@ bt:
pattern: "ddddd"
# (Bouvet Island)
bv:
partition: 185
languages: "no"
names: !include country-names/bv.yaml
# Botswana (Botswana)
bw:
partition: 122
@@ -274,6 +332,13 @@ ca:
output: \1 \2
# Cocos (Keeling) Islands (Cocos (Keeling) Islands)
cc:
partition: 118
languages: en
names: !include country-names/cc.yaml
# Democratic Republic of the Congo (République démocratique du Congo)
cd:
partition: 229
@@ -385,6 +450,20 @@ cv:
pattern: "dddd"
# Curaçao (Curaçao)
cw:
partition: 248
languages: nl, en
names: !include country-names/cw.yaml
# Christmas Island (Christmas Island)
cx:
partition: 177
languages: en
names: !include country-names/cx.yaml
# Cyprus (Κύπρος - Kıbrıs)
cy:
partition: 114
@@ -604,6 +683,13 @@ ge:
pattern: "dddd"
# French Guiana (Guyane Française)
gf:
partition: 231
languages: fr
names: !include country-names/gf.yaml
# Guernsey (Guernsey)
gg:
partition: 77
@@ -659,6 +745,13 @@ gn:
pattern: "ddd"
# Guadeloupe (Guadeloupe)
gp:
partition: 232
languages: fr
names: !include country-names/gp.yaml
# Equatorial Guinea (Guinea Ecuatorial)
gq:
partition: 12
@@ -696,6 +789,13 @@ gt:
pattern: "ddddd"
# Guam (Guam)
gu:
partition: 187
languages: en, ch
names: !include country-names/gu.yaml
# Guinea-Bissau (Guiné-Bissau)
gw:
partition: 8
@@ -713,6 +813,20 @@ gy:
postcode: no
# (Hong Kong)
hk:
partition: 188
languages: zh-hant, en
names: !include country-names/hk.yaml
# (Heard Island and McDonald Islands)
hm:
partition: 189
languages: en
names: !include country-names/hm.yaml
# Honduras (Honduras)
hn:
partition: 56
@@ -1115,6 +1229,13 @@ me:
pattern: "ddddd"
# Saint Martin (Saint Martin)
mf:
partition: 203
languages: fr
names: !include country-names/mf.yaml
# Madagascar (Madagasikara)
mg:
partition: 164
@@ -1168,6 +1289,28 @@ mn:
pattern: "ddddd"
# Macao (Macao)
mo:
partition: 191
languages: zh-hant, pt
names: !include country-names/mo.yaml
postcode: no
# Northern Mariana Islands (Northern Mariana Islands)
mp:
partition: 192
languages: ch, en
names: !include country-names/mp.yaml
# Martinique (Martinique)
mq:
partition: 233
languages: fr
names: !include country-names/mq.yaml
# Mauritania (موريتانيا)
mr:
partition: 149
@@ -1255,6 +1398,13 @@ na:
pattern: "ddddd"
# New Caledonia (Nouvelle-Calédonie)
nc:
partition: 234
languages: fr
names: !include country-names/nc.yaml
# Niger (Niger)
ne:
partition: 226
@@ -1264,6 +1414,13 @@ ne:
pattern: "dddd"
# Norfolk Island (Norfolk Island)
nf:
partition: 100
languages: en, pih
names: !include country-names/nf.yaml
# Nigeria (Nigeria)
ng:
partition: 218
@@ -1362,6 +1519,13 @@ pe:
pattern: "ddddd"
# French Polynesia (Polynésie française)
pf:
partition: 202
languages: fr
names: !include country-names/pf.yaml
# Papua New Guinea (Papua Niugini)
pg:
partition: 71
@@ -1399,6 +1563,13 @@ pl:
output: \1-\2
# Saint Pierre and Miquelon (Saint-Pierre-et-Miquelon)
pm:
partition: 236
languages: fr
names: !include country-names/pm.yaml
# Pitcairn Islands (Pitcairn Islands)
pn:
partition: 113
@@ -1409,6 +1580,13 @@ pn:
output: \1 \2
# Puerto Rico (Puerto Rico)
pr:
partition: 193
languages: es, en
names: !include country-names/pr.yaml
# Palestinian Territory (Palestinian Territory)
ps:
partition: 194
@@ -1453,6 +1631,13 @@ qa:
postcode: no
# (Réunion)
re:
partition: 235
languages: fr
names: !include country-names/re.yaml
# Romania (România)
ro:
partition: 170
@@ -1560,6 +1745,13 @@ si:
pattern: "dddd"
# (Svalbard and Jan Mayen)
sj:
partition: 197
languages: "no"
names: !include country-names/sj.yaml
# Slovakia (Slovensko)
sk:
partition: 172
@@ -1639,6 +1831,13 @@ sv:
pattern: "dddd"
# (Sint Maarten)
sx:
partition: 249
languages: nl, en
names: !include country-names/sx.yaml
# Syria (سوريا)
sy:
partition: 104
@@ -1674,6 +1873,13 @@ td:
postcode: no
# French Southern Lands (Terres australes et antarctiques françaises)
tf:
partition: 132
languages: fr
names: !include country-names/tf.yaml
# Togo (Togo)
tg:
partition: 243
@@ -1803,6 +2009,15 @@ ug:
postcode: no
# (United States Minor Outlying Islands)
um:
partition: 198
languages: en
names: !include country-names/um.yaml
postcode:
pattern: "96898"
# United States (United States)
us:
partition: 2
@@ -1868,6 +2083,13 @@ vg:
output: VG\1
# (United States Virgin Islands)
vi:
partition: 199
languages: en
names: !include country-names/vi.yaml
# Vietnam (Việt Nam)
vn:
partition: 75
@@ -1885,6 +2107,13 @@ vu:
postcode: no
# Wallis and Futuna Islands (Wallis-et-Futuna)
wf:
partition: 238
languages: fr
names: !include country-names/wf.yaml
# Samoa (Sāmoa)
ws:
partition: 131
@@ -1909,6 +2138,13 @@ ye:
postcode: no
# Mayotte (Mayotte)
yt:
partition: 200
languages: fr
names: !include country-names/yt.yaml
# South Africa (South Africa)
za:
partition: 76


@@ -33,15 +33,6 @@ NOMINATIM_MAX_WORD_FREQUENCY=50000
# If true, admin level changes on places with many contained children are blocked.
NOMINATIM_LIMIT_REINDEXING=yes
# When set to 'yes' changes to OSM objects will be propagated to dependent
# objects. This means that geometries of way/relations are updated when a
# node on a way or a way in a relation changes.
# EXPERT ONLY: Use with care. Enabling this option might lead to significantly
# more load when updates are applied.
# Do not enable this option on an existing database.
# The default is to not propagate these changes.
NOMINATIM_UPDATE_FORWARD_DEPENDENCIES=no
# Restrict search languages.
# Normally Nominatim will include all language variants of name:XX
# in the search index. Set this to a comma separated list of language


@@ -1,510 +0,0 @@
-- Core functions for Nominatim import flex style.
--
local module = {}
local PRE_DELETE = nil
local PRE_EXTRAS = nil
local POST_DELETE = nil
local MAIN_KEYS = nil
local NAMES = nil
local ADDRESS_TAGS = nil
local SAVE_EXTRA_MAINS = false
local POSTCODE_FALLBACK = true
-- The single place table.
local place_table = osm2pgsql.define_table{
name = "place",
ids = { type = 'any', id_column = 'osm_id', type_column = 'osm_type' },
columns = {
{ column = 'class', type = 'text', not_null = true },
{ column = 'type', type = 'text', not_null = true },
{ column = 'admin_level', type = 'smallint' },
{ column = 'name', type = 'hstore' },
{ column = 'address', type = 'hstore' },
{ column = 'extratags', type = 'hstore' },
{ column = 'geometry', type = 'geometry', projection = 'WGS84', not_null = true },
},
indexes = {}
}
------------ Geometry functions for relations ---------------------
function module.relation_as_multipolygon(o)
return o:as_multipolygon()
end
function module.relation_as_multiline(o)
return o:as_multilinestring():line_merge()
end
module.RELATION_TYPES = {
multipolygon = module.relation_as_multipolygon,
boundary = module.relation_as_multipolygon,
waterway = module.relation_as_multiline
}
------------- Place class ------------------------------------------
local Place = {}
Place.__index = Place
function Place.new(object, geom_func)
local self = setmetatable({}, Place)
self.object = object
self.geom_func = geom_func
self.admin_level = tonumber(self.object:grab_tag('admin_level'))
if self.admin_level == nil
or self.admin_level <= 0 or self.admin_level > 15
or math.floor(self.admin_level) ~= self.admin_level then
self.admin_level = 15
end
self.num_entries = 0
self.has_name = false
self.names = {}
self.address = {}
self.extratags = {}
return self
end
function Place:clean(data)
for k, v in pairs(self.object.tags) do
if data.delete ~= nil and data.delete(k, v) then
self.object.tags[k] = nil
elseif data.extra ~= nil and data.extra(k, v) then
self.extratags[k] = v
self.object.tags[k] = nil
end
end
end
function Place:delete(data)
if data.match ~= nil then
for k, v in pairs(self.object.tags) do
if data.match(k, v) then
self.object.tags[k] = nil
end
end
end
end
function Place:grab_extratags(data)
local count = 0
if data.match ~= nil then
for k, v in pairs(self.object.tags) do
if data.match(k, v) then
self.object.tags[k] = nil
self.extratags[k] = v
count = count + 1
end
end
end
return count
end
local function strip_address_prefix(k)
if k:sub(1, 5) == 'addr:' then
return k:sub(6)
end
if k:sub(1, 6) == 'is_in:' then
return k:sub(7)
end
return k
end
function Place:grab_address_parts(data)
local count = 0
if data.groups ~= nil then
for k, v in pairs(self.object.tags) do
local atype = data.groups(k, v)
if atype ~= nil then
if atype == 'main' then
self.has_name = true
self.address[strip_address_prefix(k)] = v
count = count + 1
elseif atype == 'extra' then
self.address[strip_address_prefix(k)] = v
else
self.address[atype] = v
end
self.object.tags[k] = nil
end
end
end
return count
end
function Place:grab_name_parts(data)
local fallback = nil
if data.groups ~= nil then
for k, v in pairs(self.object.tags) do
local atype = data.groups(k, v)
if atype ~= nil then
self.names[k] = v
self.object.tags[k] = nil
if atype == 'main' then
self.has_name = true
elseif atype == 'house' then
self.has_name = true
fallback = {'place', 'house', 'always'}
end
end
end
end
return fallback
end
function Place:write_place(k, v, mtype, save_extra_mains)
if mtype == nil then
return 0
end
v = v or self.object.tags[k]
if v == nil then
return 0
end
if type(mtype) == 'table' then
mtype = mtype[v] or mtype[1]
end
if mtype == 'always' or (self.has_name and mtype == 'named') then
return self:write_row(k, v, save_extra_mains)
end
if mtype == 'named_with_key' then
local names = {}
local prefix = k .. ':name'
for namek, namev in pairs(self.object.tags) do
if namek:sub(1, #prefix) == prefix
and (#namek == #prefix
or namek:sub(#prefix + 1, #prefix + 1) == ':') then
names[namek:sub(#k + 2)] = namev
end
end
if next(names) ~= nil then
local saved_names = self.names
self.names = names
local results = self:write_row(k, v, save_extra_mains)
self.names = saved_names
return results
end
end
return 0
end
function Place:write_row(k, v, save_extra_mains)
if self.geometry == nil then
self.geometry = self.geom_func(self.object)
end
if self.geometry:is_null() then
return 0
end
if save_extra_mains ~= nil then
for extra_k, extra_v in pairs(self.object.tags) do
if extra_k ~= k and save_extra_mains(extra_k, extra_v) then
self.extratags[extra_k] = extra_v
end
end
end
place_table:insert{
class = k,
type = v,
admin_level = self.admin_level,
name = next(self.names) and self.names,
address = next(self.address) and self.address,
extratags = next(self.extratags) and self.extratags,
geometry = self.geometry
}
if save_extra_mains then
for k, v in pairs(self.object.tags) do
if save_extra_mains(k, v) then
self.extratags[k] = nil
end
end
end
self.num_entries = self.num_entries + 1
return 1
end
function module.tag_match(data)
if data == nil or next(data) == nil then
return nil
end
local fullmatches = {}
local key_prefixes = {}
local key_suffixes = {}
if data.keys ~= nil then
for _, key in pairs(data.keys) do
if key:sub(1, 1) == '*' then
if #key > 1 then
if key_suffixes[#key - 1] == nil then
key_suffixes[#key - 1] = {}
end
key_suffixes[#key - 1][key:sub(2)] = true
end
elseif key:sub(#key, #key) == '*' then
if key_prefixes[#key - 1] == nil then
key_prefixes[#key - 1] = {}
end
key_prefixes[#key - 1][key:sub(1, #key - 1)] = true
else
fullmatches[key] = true
end
end
end
if data.tags ~= nil then
for k, vlist in pairs(data.tags) do
if fullmatches[k] == nil then
fullmatches[k] = {}
for _, v in pairs(vlist) do
fullmatches[k][v] = true
end
end
end
end
return function (k, v)
if fullmatches[k] ~= nil and (fullmatches[k] == true or fullmatches[k][v] ~= nil) then
return true
end
for slen, slist in pairs(key_suffixes) do
if #k >= slen and slist[k:sub(-slen)] ~= nil then
return true
end
end
for slen, slist in pairs(key_prefixes) do
if #k >= slen and slist[k:sub(1, slen)] ~= nil then
return true
end
end
return false
end
end
function module.tag_group(data)
if data == nil or next(data) == nil then
return nil
end
local fullmatches = {}
local key_prefixes = {}
local key_suffixes = {}
for group, tags in pairs(data) do
for _, key in pairs(tags) do
if key:sub(1, 1) == '*' then
if #key > 1 then
if key_suffixes[#key - 1] == nil then
key_suffixes[#key - 1] = {}
end
key_suffixes[#key - 1][key:sub(2)] = group
end
elseif key:sub(#key, #key) == '*' then
if key_prefixes[#key - 1] == nil then
key_prefixes[#key - 1] = {}
end
key_prefixes[#key - 1][key:sub(1, #key - 1)] = group
else
fullmatches[key] = group
end
end
end
return function (k, v)
local val = fullmatches[k]
if val ~= nil then
return val
end
for slen, slist in pairs(key_suffixes) do
if #k >= slen then
val = slist[k:sub(-slen)]
if val ~= nil then
return val
end
end
end
for slen, slist in pairs(key_prefixes) do
if #k >= slen then
val = slist[k:sub(1, slen)]
if val ~= nil then
return val
end
end
end
end
end
-- Process functions for all data types
function module.process_node(object)
local function geom_func(o)
return o:as_point()
end
module.process_tags(Place.new(object, geom_func))
end
function module.process_way(object)
local function geom_func(o)
local geom = o:as_polygon()
if geom:is_null() then
geom = o:as_linestring()
end
return geom
end
module.process_tags(Place.new(object, geom_func))
end
function module.process_relation(object)
local geom_func = module.RELATION_TYPES[object.tags.type]
if geom_func ~= nil then
module.process_tags(Place.new(object, geom_func))
end
end
-- The process functions are used by default by osm2pgsql.
osm2pgsql.process_node = module.process_node
osm2pgsql.process_way = module.process_way
osm2pgsql.process_relation = module.process_relation
function module.process_tags(o)
o:clean{delete = PRE_DELETE, extra = PRE_EXTRAS}
-- Exception for boundary/place double tagging
if o.object.tags.boundary == 'administrative' then
o:grab_extratags{match = function (k, v)
return k == 'place' and v:sub(1,3) ~= 'isl'
end}
end
-- name keys
local fallback = o:grab_name_parts{groups=NAMES}
-- address keys
if o:grab_address_parts{groups=ADDRESS_TAGS} > 0 and fallback == nil then
fallback = {'place', 'house', 'always'}
end
if o.address.country ~= nil and #o.address.country ~= 2 then
o.address['country'] = nil
end
if POSTCODE_FALLBACK and fallback == nil and o.address.postcode ~= nil then
fallback = {'place', 'postcode', 'always'}
end
if o.address.interpolation ~= nil then
o:write_place('place', 'houses', 'always', SAVE_EXTRA_MAINS)
return
end
o:clean{delete = POST_DELETE}
-- collect main keys
for k, v in pairs(o.object.tags) do
local ktype = MAIN_KEYS[k]
if ktype == 'fallback' then
if o.has_name then
fallback = {k, v, 'named'}
end
elseif ktype ~= nil then
o:write_place(k, v, MAIN_KEYS[k], SAVE_EXTRA_MAINS)
end
end
if fallback ~= nil and o.num_entries == 0 then
o:write_place(fallback[1], fallback[2], fallback[3], SAVE_EXTRA_MAINS)
end
end
--------- Convenience functions for simple style configuration -----------------
function module.set_prefilters(data)
PRE_DELETE = module.tag_match{keys = data.delete_keys, tags = data.delete_tags}
PRE_EXTRAS = module.tag_match{keys = data.extratag_keys,
tags = data.extratag_tags}
end
function module.set_main_tags(data)
MAIN_KEYS = data
end
function module.set_name_tags(data)
NAMES = module.tag_group(data)
end
function module.set_address_tags(data)
if data.postcode_fallback ~= nil then
POSTCODE_FALLBACK = data.postcode_fallback
data.postcode_fallback = nil
end
ADDRESS_TAGS = module.tag_group(data)
end
function module.set_unused_handling(data)
if data.extra_keys == nil and data.extra_tags == nil then
POST_DELETE = module.tag_match{keys = data.delete_keys, tags = data.delete_tags}
SAVE_EXTRA_MAINS = function() return true end
elseif data.delete_keys == nil and data.delete_tags == nil then
POST_DELETE = nil
SAVE_EXTRA_MAINS = module.tag_match{keys = data.extra_keys, tags = data.extra_tags}
else
error("unused handler can have only 'extra_keys' or 'delete_keys' set.")
end
end
function module.set_relation_types(data)
module.RELATION_TYPES = {}
for k, v in pairs(data) do
if v == 'multipolygon' then
module.RELATION_TYPES[k] = module.relation_as_multipolygon
elseif v == 'multiline' then
module.RELATION_TYPES[k] = module.relation_as_multiline
end
end
end
return module
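As a usage note on the module above: the predicate returned by `module.tag_match` handles exact keys, trailing-wildcard prefixes (`note:*`), leading-wildcard suffixes (`*source`), and key/value pairs from the `tags` table. A minimal sketch of how a style script might exercise it — the key list here is purely illustrative, and it assumes `flex-base.lua` is reachable via `package.path`:

```lua
local flex = require('flex-base')

-- Build a predicate from exact keys, wildcard patterns and
-- key/value pairs (keys chosen only for illustration).
local is_deletable = flex.tag_match{
    keys = {'created_by', 'note:*', '*source'},
    tags = {highway = {'no'}}
}

assert(is_deletable('created_by', 'JOSM'))      -- exact key match
assert(is_deletable('note:en', 'bridge'))       -- prefix pattern 'note:*'
assert(is_deletable('tiger:source', 'survey'))  -- suffix pattern '*source'
assert(is_deletable('highway', 'no'))           -- listed key/value pair
assert(not is_deletable('highway', 'primary'))  -- value not in the list
```

This mirrors what `set_prefilters` does internally when it builds the `PRE_DELETE` and `PRE_EXTRAS` matchers from a style's `delete_keys`/`delete_tags` configuration.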


@@ -24,7 +24,6 @@ transliteration:
- ":: lower ()"
- "[^a-z0-9[:Space:]] >"
- ":: NFC ()"
- "[:Space:]+ > ' '"
sanitizers:
- step: clean-housenumbers
filter-kind:
@@ -36,9 +35,7 @@ sanitizers:
- step: clean-postcodes
convert-to-address: yes
default-pattern: "[A-Z0-9- ]{3,12}"
- step: clean-tiger-tags
- step: split-name-list
delimiters: ;
- step: strip-brace-terms
- step: tag-analyzer-by-language
filter-kind: [".*name.*"]


@@ -1,67 +0,0 @@
local flex = require('flex-base')
flex.set_main_tags{
highway = {'always',
street_lamp = 'named',
traffic_signals = 'named',
service = 'named',
cycleway = 'named',
path = 'named',
footway = 'named',
steps = 'named',
bridleway = 'named',
track = 'named',
motorway_link = 'named',
trunk_link = 'named',
primary_link = 'named',
secondary_link = 'named',
tertiary_link = 'named'},
boundary = {administrative = 'named',
postal_code = 'always'},
landuse = 'fallback',
place = 'always'
}
flex.set_prefilters{delete_keys = {'building', 'source',
'source', '*source', 'type',
'is_in:postcode', '*:wikidata', '*:wikipedia',
'*:prefix', '*:suffix', 'name:prefix:*', 'name:suffix:*',
'name:etymology', 'name:signed', 'name:botanical',
'addr:street:name', 'addr:street:type'},
delete_tags = {highway = {'no', 'turning_circle', 'mini_roundabout',
'noexit', 'crossing', 'give_way', 'stop'},
landuse = {'cemetry', 'no'},
boundary = {'place'}},
extratag_keys = {'wikipedia', 'wikipedia:*', 'wikidata', 'capital', 'area'}
}
flex.set_name_tags{main = {'name', 'name:*',
'int_name', 'int_name:*',
'nat_name', 'nat_name:*',
'reg_name', 'reg_name:*',
'loc_name', 'loc_name:*',
'old_name', 'old_name:*',
'alt_name', 'alt_name:*', 'alt_name_*',
'official_name', 'official_name:*',
'place_name', 'place_name:*',
'short_name', 'short_name:*'},
extra = {'ref', 'int_ref', 'nat_ref', 'reg_ref',
'loc_ref', 'old_ref',
'iata', 'icao', 'pcode', 'pcode:*', 'ISO3166-2'},
house = {'addr:housename'}
}
flex.set_address_tags{main = {'addr:housenumber',
'addr:conscriptionnumber',
'addr:streetnumber'},
extra = {'addr:*', 'is_in:*', 'tiger:county'},
postcode = {'postal_code', 'postcode', 'addr:postcode',
'tiger:zip_left', 'tiger:zip_right'},
country = {'country_code', 'ISO3166-1',
'addr:country_code', 'is_in:country_code',
'addr:country', 'is_in:country'},
interpolation = {'addr:interpolation'}
}
flex.set_unused_handling{extra_keys = {'place'}}


@@ -1,44 +0,0 @@
local flex = require('flex-base')
flex.set_main_tags{
boundary = {administrative = 'named'},
landuse = 'fallback',
place = 'always'
}
flex.set_prefilters{delete_keys = {'building', 'source', 'highway',
'addr:housenumber', 'addr:street', 'addr:city',
'source', '*source', 'type',
'is_in:postcode', '*:wikidata', '*:wikipedia',
'*:prefix', '*:suffix', 'name:prefix:*', 'name:suffix:*',
'name:etymology', 'name:signed', 'name:botanical',
'addr:street:name', 'addr:street:type'},
delete_tags = {landuse = {'cemetry', 'no'},
boundary = {'place'}},
extratag_keys = {'wikipedia', 'wikipedia:*', 'wikidata', 'capital'}
}
flex.set_name_tags{main = {'name', 'name:*',
'int_name', 'int_name:*',
'nat_name', 'nat_name:*',
'reg_name', 'reg_name:*',
'loc_name', 'loc_name:*',
'old_name', 'old_name:*',
'alt_name', 'alt_name:*', 'alt_name_*',
'official_name', 'official_name:*',
'place_name', 'place_name:*',
'short_name', 'short_name:*'},
extra = {'ref', 'int_ref', 'nat_ref', 'reg_ref',
'loc_ref', 'old_ref',
'iata', 'icao', 'pcode', 'pcode:*', 'ISO3166-2'}
}
flex.set_address_tags{extra = {'addr:*', 'is_in:*'},
postcode = {'postal_code', 'postcode', 'addr:postcode'},
country = {'country_code', 'ISO3166-1',
'addr:country_code', 'is_in:country_code',
'addr:country', 'is_in:country'},
postcode_fallback = false
}
flex.set_unused_handling{extra_keys = {'place'}}


@@ -1,113 +0,0 @@
local flex = require('flex-base')
flex.set_main_tags{
building = 'fallback',
emergency = 'always',
healthcare = 'fallback',
historic = 'always',
military = 'always',
natural = 'named',
landuse = 'named',
highway = {'always',
street_lamp = 'named',
traffic_signals = 'named',
service = 'named',
cycleway = 'named',
path = 'named',
footway = 'named',
steps = 'named',
bridleway = 'named',
track = 'named',
motorway_link = 'named',
trunk_link = 'named',
primary_link = 'named',
secondary_link = 'named',
tertiary_link = 'named'},
railway = 'named',
man_made = 'always',
aerialway = 'always',
boundary = {'named',
postal_code = 'always'},
aeroway = 'always',
amenity = 'always',
club = 'always',
craft = 'always',
junction = 'fallback',
landuse = 'fallback',
leisure = 'always',
office = 'always',
mountain_pass = 'always',
shop = 'always',
tourism = 'always',
bridge = 'named_with_key',
tunnel = 'named_with_key',
waterway = 'named',
place = 'always'
}
flex.set_prefilters{delete_keys = {'note', 'note:*', 'source', '*source', 'attribution',
'comment', 'fixme', 'FIXME', 'created_by', 'NHD:*',
'nhd:*', 'gnis:*', 'geobase:*', 'KSJ2:*', 'yh:*',
'osak:*', 'naptan:*', 'CLC:*', 'import', 'it:fvg:*',
'type', 'lacounty:*', 'ref:ruian:*', 'building:ruian:type',
'ref:linz:*', 'is_in:postcode'},
delete_tags = {emergency = {'yes', 'no', 'fire_hydrant'},
historic = {'yes', 'no'},
military = {'yes', 'no'},
natural = {'yes', 'no', 'coastline'},
highway = {'no', 'turning_circle', 'mini_roundabout',
'noexit', 'crossing', 'give_way', 'stop'},
railway = {'level_crossing', 'no', 'rail'},
man_made = {'survey_point', 'cutline'},
aerialway = {'pylon', 'no'},
aeroway = {'no'},
amenity = {'no'},
club = {'no'},
craft = {'no'},
leisure = {'no'},
office = {'no'},
mountain_pass = {'no'},
shop = {'no'},
tourism = {'yes', 'no'},
bridge = {'no'},
tunnel = {'no'},
waterway = {'riverbank'},
building = {'no'},
boundary = {'place'}},
extratag_keys = {'*:prefix', '*:suffix', 'name:prefix:*', 'name:suffix:*',
'name:etymology', 'name:signed', 'name:botanical',
'wikidata', '*:wikidata',
'*:wikipedia', 'brand:wikipedia:*',
'addr:street:name', 'addr:street:type'}
}
flex.set_name_tags{main = {'name', 'name:*',
'int_name', 'int_name:*',
'nat_name', 'nat_name:*',
'reg_name', 'reg_name:*',
'loc_name', 'loc_name:*',
'old_name', 'old_name:*',
'alt_name', 'alt_name:*', 'alt_name_*',
'official_name', 'official_name:*',
'place_name', 'place_name:*',
'short_name', 'short_name:*', 'brand'},
extra = {'ref', 'int_ref', 'nat_ref', 'reg_ref',
'loc_ref', 'old_ref',
'iata', 'icao', 'pcode', 'pcode:*', 'ISO3166-2'},
house = {'addr:housename'}
}
flex.set_address_tags{main = {'addr:housenumber',
'addr:conscriptionnumber',
'addr:streetnumber'},
extra = {'addr:*', 'is_in:*', 'tiger:county'},
postcode = {'postal_code', 'postcode', 'addr:postcode',
'tiger:zip_left', 'tiger:zip_right'},
country = {'country_code', 'ISO3166-1',
'addr:country_code', 'is_in:country_code',
'addr:country', 'is_in:country'},
interpolation = {'addr:interpolation'}
}
flex.set_unused_handling{delete_keys = {'tiger:*'}}


@@ -1,113 +0,0 @@
flex = require('flex-base')
flex.set_main_tags{
    building = 'fallback',
    emergency = 'always',
    healthcare = 'fallback',
    historic = 'always',
    military = 'always',
    natural = 'named',
    landuse = 'named',
    highway = {'always',
               street_lamp = 'named',
               traffic_signals = 'named',
               service = 'named',
               cycleway = 'named',
               path = 'named',
               footway = 'named',
               steps = 'named',
               bridleway = 'named',
               track = 'named',
               motorway_link = 'named',
               trunk_link = 'named',
               primary_link = 'named',
               secondary_link = 'named',
               tertiary_link = 'named'},
    railway = 'named',
    man_made = 'always',
    aerialway = 'always',
    boundary = {'named',
                postal_code = 'always'},
    aeroway = 'always',
    amenity = 'always',
    club = 'always',
    craft = 'always',
    junction = 'fallback',
    landuse = 'fallback',
    leisure = 'always',
    office = 'always',
    mountain_pass = 'always',
    shop = 'always',
    tourism = 'always',
    bridge = 'named_with_key',
    tunnel = 'named_with_key',
    waterway = 'named',
    place = 'always'
}
flex.set_prefilters{delete_keys = {'note', 'note:*', 'source', '*source', 'attribution',
                                   'comment', 'fixme', 'FIXME', 'created_by', 'NHD:*',
                                   'nhd:*', 'gnis:*', 'geobase:*', 'KSJ2:*', 'yh:*',
                                   'osak:*', 'naptan:*', 'CLC:*', 'import', 'it:fvg:*',
                                   'type', 'lacounty:*', 'ref:ruian:*', 'building:ruian:type',
                                   'ref:linz:*', 'is_in:postcode',
                                   '*:prefix', '*:suffix', 'name:prefix:*', 'name:suffix:*',
                                   'name:etymology', 'name:signed', 'name:botanical',
                                   '*:wikidata', '*:wikipedia', 'brand:wikipedia:*',
                                   'addr:street:name', 'addr:street:type'},
                    delete_tags = {emergency = {'yes', 'no', 'fire_hydrant'},
                                   historic = {'yes', 'no'},
                                   military = {'yes', 'no'},
                                   natural = {'yes', 'no', 'coastline'},
                                   highway = {'no', 'turning_circle', 'mini_roundabout',
                                              'noexit', 'crossing', 'give_way', 'stop'},
                                   railway = {'level_crossing', 'no', 'rail'},
                                   man_made = {'survey_point', 'cutline'},
                                   aerialway = {'pylon', 'no'},
                                   aeroway = {'no'},
                                   amenity = {'no'},
                                   club = {'no'},
                                   craft = {'no'},
                                   leisure = {'no'},
                                   office = {'no'},
                                   mountain_pass = {'no'},
                                   shop = {'no'},
                                   tourism = {'yes', 'no'},
                                   bridge = {'no'},
                                   tunnel = {'no'},
                                   waterway = {'riverbank'},
                                   building = {'no'},
                                   boundary = {'place'}},
                    extra_keys = {'wikidata', 'wikipedia', 'wikipedia:*'}
}
flex.set_name_tags{main = {'name', 'name:*',
                           'int_name', 'int_name:*',
                           'nat_name', 'nat_name:*',
                           'reg_name', 'reg_name:*',
                           'loc_name', 'loc_name:*',
                           'old_name', 'old_name:*',
                           'alt_name', 'alt_name:*', 'alt_name_*',
                           'official_name', 'official_name:*',
                           'place_name', 'place_name:*',
                           'short_name', 'short_name:*', 'brand'},
                   extra = {'ref', 'int_ref', 'nat_ref', 'reg_ref',
                            'loc_ref', 'old_ref',
                            'iata', 'icao', 'pcode', 'pcode:*', 'ISO3166-2'},
                   house = {'addr:housename'}
}
flex.set_address_tags{main = {'addr:housenumber',
                              'addr:conscriptionnumber',
                              'addr:streetnumber'},
                      extra = {'addr:*', 'is_in:*', 'tiger:county'},
                      postcode = {'postal_code', 'postcode', 'addr:postcode',
                                  'tiger:zip_left', 'tiger:zip_right'},
                      country = {'country_code', 'ISO3166-1',
                                 'addr:country_code', 'is_in:country_code',
                                 'addr:country', 'is_in:country'},
                      interpolation = {'addr:interpolation'}
}
flex.set_unused_handling{extra_keys = {'place'}}


@@ -1,67 +0,0 @@
flex = require('flex-base')
flex.set_main_tags{
    highway = {'always',
               street_lamp = 'named',
               traffic_signals = 'named',
               service = 'named',
               cycleway = 'named',
               path = 'named',
               footway = 'named',
               steps = 'named',
               bridleway = 'named',
               track = 'named',
               motorway_link = 'named',
               trunk_link = 'named',
               primary_link = 'named',
               secondary_link = 'named',
               tertiary_link = 'named'},
    boundary = {administrative = 'named',
                postal_code = 'always'},
    landuse = 'fallback',
    place = 'always'
}
flex.set_prefilters{delete_keys = {'building', 'source',
                                   'addr:housenumber', 'addr:street',
                                   'source', '*source', 'type',
                                   'is_in:postcode', '*:wikidata', '*:wikipedia',
                                   '*:prefix', '*:suffix', 'name:prefix:*', 'name:suffix:*',
                                   'name:etymology', 'name:signed', 'name:botanical',
                                   'addr:street:name', 'addr:street:type'},
                    delete_tags = {highway = {'no', 'turning_circle', 'mini_roundabout',
                                              'noexit', 'crossing', 'give_way', 'stop'},
                                   landuse = {'cemetry', 'no'},
                                   boundary = {'place'}},
                    extratag_keys = {'wikipedia', 'wikipedia:*', 'wikidata', 'capital', 'area'}
}
flex.set_name_tags{main = {'name', 'name:*',
                           'int_name', 'int_name:*',
                           'nat_name', 'nat_name:*',
                           'reg_name', 'reg_name:*',
                           'loc_name', 'loc_name:*',
                           'old_name', 'old_name:*',
                           'alt_name', 'alt_name:*', 'alt_name_*',
                           'official_name', 'official_name:*',
                           'place_name', 'place_name:*',
                           'short_name', 'short_name:*'},
                   extra = {'ref', 'int_ref', 'nat_ref', 'reg_ref',
                            'loc_ref', 'old_ref',
                            'iata', 'icao', 'pcode', 'pcode:*', 'ISO3166-2'}
}
flex.set_address_tags{main = {'addr:housenumber',
                              'addr:conscriptionnumber',
                              'addr:streetnumber'},
                      extra = {'addr:*', 'is_in:*', 'tiger:county'},
                      postcode = {'postal_code', 'postcode', 'addr:postcode',
                                  'tiger:zip_left', 'tiger:zip_right'},
                      country = {'country_code', 'ISO3166-1',
                                 'addr:country_code', 'is_in:country_code',
                                 'addr:country', 'is_in:country'},
                      interpolation = {'addr:interpolation'},
                      postcode_fallback = false
}
flex.set_unused_handling{extra_keys = {'place'}}


@@ -1,9 +1,12 @@
all: bdd php python
all: bdd php
no-test-db: bdd-no-test-db php
bdd:
	cd bdd && behave -DREMOVE_TEMPLATE=1
icu:
	cd bdd && behave -DREMOVE_TEMPLATE=1 -DTOKENIZER=icu
php:
	cd php && phpunit ./
@@ -11,4 +14,4 @@ python:
	pytest python
.PHONY: bdd php no-test-db python
.PHONY: bdd php no-test-db


@@ -31,11 +31,3 @@ Feature: Places by osm_type and osm_id Tests
          | jsonv2 |
          | geojson |
          | xml |
    Scenario: Lookup of a linked place
        When sending geocodejson lookup query for N1932181216
        Then exactly 1 result is returned
        And results contain
          | name |
          | Vaduz |

Some files were not shown because too many files have changed in this diff.