
SOLVED: PostgreSQL 9.3 pg_trgm search 300 million addresses

Gary Tao:

I have 300 million addresses in my DB and I want to use pg_trgm to fuzzy-search the records. The end goal is to implement a search function much like Google Maps search.

When I used pg_trgm to search these addresses, it took about 30 seconds to get the results. I used a GiST index for the trigram index because many records match the similarity threshold of 0.3, and I only need about 5 or 10 results.
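
For reference, 0.3 is pg_trgm's default similarity threshold for the % operator; on 9.3 it can be inspected and changed per session with the extension's limit functions (a minimal sketch, assuming the pg_trgm extension is already installed):

SELECT show_limit();   -- current threshold used by the % operator, 0.3 by default
SELECT set_limit(0.5); -- raise it for this session so fewer rows qualify as matches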

Is there a good way to improve the performance, or a sensible plan for partitioning the table? My PostgreSQL version is 9.3.

Here is the SQL for creating the index and running the search:

CREATE INDEX addresses_trgm_index ON addresses USING gist (address gist_trgm_ops);

SELECT address, similarity(address, '981 maun st') AS sml
FROM addresses
WHERE address % '981 maun st'
ORDER BY sml DESC
LIMIT 10;
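
One approach worth trying (a sketch, not a verified fix for a table this size): pg_trgm's GiST opclass has supported the <-> distance operator since PostgreSQL 9.1, so the existing index can drive a KNN scan that stops after the nearest 10 rows instead of collecting and sorting every row above the threshold:

SELECT address, similarity(address, '981 maun st') AS sml
FROM addresses
ORDER BY address <-> '981 maun st' -- KNN GiST scan; distance here is 1 - similarity
LIMIT 10;

How much this helps depends on the data distribution: the KNN scan avoids materializing the full match set, but it still has to walk the index far enough to find the 10 nearest entries.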



