Gary Tao:
I have 300 million addresses in my DB and I want to use pg_trgm to fuzzy-search the records. The final goal is to implement a search function much like Google Maps search.
When I used pg_trgm to search these addresses, it took about 30 s to get the results. I created the trgm index with GiST because many records match the similarity threshold of 0.3 and I only need about 5 or 10 results.
Is there a good way to improve the performance, or a good plan for table partitioning? My PostgreSQL version is 9.3.
Here are my SQL statements for creating the index and searching:
CREATE INDEX addresses_trgm_index ON addresses USING gist (address gist_trgm_ops);
SELECT address, similarity(address, '981 maun st') AS sml FROM addresses WHERE address % '981 maun st' ORDER BY sml DESC LIMIT 10;
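One variant I have been considering (a sketch, not verified on the full table): pg_trgm's `<->` trigram-distance operator can use a GiST index for index-assisted nearest-neighbor ordering (available since PostgreSQL 9.1), so an `ORDER BY ... LIMIT 10` can stop after the ten closest rows instead of scoring every row that passes the `%` filter:

-- Order by trigram distance instead of filtering with % and then
-- sorting all matches by similarity(). With the existing GiST trgm
-- index, PostgreSQL can walk the index in distance order and stop
-- as soon as LIMIT is satisfied.
SELECT address, similarity(address, '981 maun st') AS sml
FROM addresses
ORDER BY address <-> '981 maun st'
LIMIT 10;

Note that this drops the `%` predicate, so very dissimilar rows can appear when there are fewer than 10 close matches; if that matters, the `WHERE address % '981 maun st'` clause can be kept, or the threshold raised with `SELECT set_limit(0.5);` before querying.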