Hi there,
Welcome to the June 2022 issue of our pganalyze newsletter!
Today, we show a quick in-app walk-through of our pganalyze Index Advisor so you can see how to use it in day-to-day situations. We are also sharing two new open positions we are hiring for at pganalyze, a recap of our recent indexing webinar, our latest 5mins of Postgres episodes, and more.
Additionally, you can find details about the latest pganalyze collector release (0.44.0), which features a new capability to filter sensitive data in EXPLAIN plans and includes multiple integration improvements for AWS, GCP, and Heroku. Upgrading is recommended.
Check out everything below!
- - -
pganalyze Index Advisor walk-through
One of the most common pieces of feedback from the post-webinar survey of our recent "How to reason about indexing your Postgres database" webinar was that people wanted to see more of the pganalyze Index Advisor in action. I created this 5-minute video so you can learn more about it:
pganalyze Index Advisor walk-through→
- - -
New EXPLAIN plan filter option in the pganalyze collector
pganalyze has provided filtering capabilities for log text for a while (see PII filtering settings), but historically the collection of query samples and EXPLAIN plans was all or nothing. With the new 0.44.0 collector release we have introduced a new filter setting: "filter_query_sample = normalize".
This new filtering system runs sensitive parts of the EXPLAIN plan, such as the "Index Cond" or "Filter" fields, through our query normalization function, turning query text portions like "WHERE email = 'customer@corp.com'" into "WHERE email = $1", before sending the EXPLAIN plan to the pganalyze service. Upgrade your collector and give it a try - we are eager to hear your feedback about this new capability!
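For self-managed installs, enabling this is a one-line addition to the collector configuration file. A minimal sketch of what this can look like (the server section name, connection values, and file path below are illustrative, not from your setup):

```ini
# /etc/pganalyze-collector.conf (example path; section and connection values are placeholders)
[pganalyze]
api_key = your_api_key_here

[my_database]
db_host = 10.0.0.1
db_name = mydb
db_username = pganalyze
db_password = secret

# New in collector 0.44.0: run sensitive EXPLAIN plan fields
# (such as "Index Cond" and "Filter") through query normalization
# before plans are sent to the pganalyze service
filter_query_sample = normalize
```

After editing the configuration, restart the collector (e.g. via `systemctl restart pganalyze-collector` on packaged installs) for the setting to take effect.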
- - -
We are hiring for 2 open positions at pganalyze - join our team:
Want to work on complex problems and help companies optimize query performance and tune their databases? We are looking to strengthen our US-based, fully distributed team with two additional positions. If you know someone who might be a good fit, we always appreciate referrals of interested candidates!
Open position: Postgres Solutions Engineer→
Open position: Backend Software Engineer→
- - -
Watch a recording of our "How to reason about indexing your Postgres database" webinar:
We provide a deep dive into how Postgres deterministically chooses which index to use for a given query, and walk through a methodology for analyzing queries, creating indexes, and optimizing them. We then show how the new pganalyze Indexing Engine has been implemented based on these principles.
Webinar recording: How to reason about indexing your Postgres database→
- - -
Provide product feedback & help us design new pganalyze functionality
We are once again looking for your feedback on what you'd like to see in the pganalyze product. We have a number of features in the pipeline, including improvements to the pganalyze alerting system, and new advisors that help you tune your database.
Help us prioritize what matters most, and join us for design research sessions:
Fill out our July 2022 customer survey→
- - -
5mins of Postgres - episodes from June:
We show how you can use the amcheck extension to verify whether this bug affects you, and how to resolve it by dropping your index and recreating it. (share this episode here)
We discuss backup improvements in Postgres 15. Specifically, we look into LZ4 and Zstandard compression, as well as the removal of the exclusive backup mode. (share this episode here)
E22: Reducing AWS Aurora I/O costs with table partitioning & understanding partition pruning→ We talk about using partitioning to reduce AWS Aurora I/O costs, pg_partman, and how partition pruning speeds up query performance. (share this episode here)
E23: Fuzzy text search & case-insensitive ICU collations in Postgres→ We're going through fuzzy text search in Postgres with LIKE/ILIKE, trigrams, levenshtein distances, as well as case-insensitive pattern matching. (share this episode here)
E24: Tuning the Postgres statistics target & understanding selectivity for spatial indexes→ We are talking about tuning the Postgres statistics target, why it is 100 and how that number was derived, and Postgres selectivity, specifically selectivity for spatial indexes. (share this episode here)
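As a quick reference for the amcheck workflow mentioned above, here is a minimal sketch of what such a verification can look like (the index and table names are placeholders for illustration; run this against your own database):

```sql
-- amcheck ships with Postgres as a contrib extension
CREATE EXTENSION IF NOT EXISTS amcheck;

-- Verify the B-tree structure of an index (name is a placeholder);
-- the second argument (heapallindexed) also checks that all heap
-- tuples are represented in the index
SELECT bt_index_check('users_email_idx'::regclass, true);

-- If corruption is reported, drop and recreate the index:
DROP INDEX CONCURRENTLY users_email_idx;
CREATE INDEX CONCURRENTLY users_email_idx ON users (email);
```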
Subscribe to the pganalyze YouTube channel to stay up to date about our weekly videos→
- - -
Product changes in June:
- - -
As always, I'm happy to hear from you and am personally reading all of your responses. Looking forward to your thoughts!
Have a nice day!
Lukas & the pganalyze team
PS: If you're interested in giving pganalyze a try you can learn more about it here. Of course, I'd be happy to walk you through a demo on a video call. You can book a 30 minute meeting with me here. Or, if you have general requests, feel free to get in touch with us here.