- This topic has 10 replies, 1 voice, and was last updated 2 months, 1 week ago by secretvrdev.
- November 7, 2020 at 6:37 am #1708 tkeer (Participant)
How we denormalized our table to get rid of complex and slow queries.
[https://medium.com/@taukeer/de-normalization-of-database-for-read-performance-220cd50ac827](https://medium.com/@taukeer/de-normalization-of-database-for-read-performance-220cd50ac827)

- November 7, 2020 at 6:37 am #1709 colshrapnel (Guest)
> We all know how bad left joins are for the performance
Pardon my ignorance, but I have probably lived under a rock and never heard of that. Can you please explain in layman’s terms what’s wrong with left joins? In my ignorance I am using them all over the place.

- November 7, 2020 at 6:37 am #1710 phordijk (Guest)
I am confused here. You are showing 10 rows in your data table and say it became so slow you needed to denormalize. Are you sure you are querying your data correctly and have proper indices?
>Luckily, the app was built with laravel, and its orm (eloquent)
Well, that *might* have been your problem, on top of the above: either your ORM is slow, or the way you work with it is.

- November 7, 2020 at 6:37 am #1711 darkhorz (Guest)
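To make phordijk's point about proper indices concrete: before denormalizing, it is worth checking whether the slow query is simply missing an index. A minimal sketch using SQLite via Python's `sqlite3` (the `orders` table and its columns are invented for illustration; the thread never shows the real schema):

```python
import sqlite3

# Hypothetical schema; the thread never shows the real tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (user_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Without an index, filtering on user_id is a full-table scan.
before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE user_id = 7"
).fetchone()[-1]

# Adding an index turns the scan into an index search.
conn.execute("CREATE INDEX idx_orders_user ON orders (user_id)")
after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE user_id = 7"
).fetchone()[-1]

print(before)  # e.g. "SCAN orders" (exact wording varies by SQLite version)
print(after)   # e.g. "SEARCH orders USING INDEX idx_orders_user (user_id=?)"
```

The same check applies to any RDBMS; only the `EXPLAIN` syntax differs.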
Is it safe to assume that it’s not a few queries that are the culprit, and that you have analyzed/tweaked the queries and optimized indices etc.?

- November 7, 2020 at 6:37 am #1712 xgryph (Guest)
I’ve made good use of denormalisation, but these joins and this amount of data don’t look like they should even be close to needing such a heavy-handed approach. I would optimise first; something isn’t right.

- November 7, 2020 at 6:37 am #1713 justaphpguy (Guest)
It’s not just the left join.
I saw first-hand (in Postgres) that joins on many rows were just not efficient enough, and de-normalizing, i.e. duplicating the data onto the primary table so you can avoid the join, was the way to go. It happens.

- November 7, 2020 at 6:37 am #1714 przemo_li (Guest)
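The duplication justaphpguy describes above can be sketched as follows, using SQLite from Python (the `users`/`posts` schema and column names are invented for illustration): the author's name is copied onto the post row at write time, so the read path never joins.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    -- Denormalized: author_name is duplicated from users at write time.
    CREATE TABLE posts (id INTEGER PRIMARY KEY, user_id INTEGER,
                        author_name TEXT, title TEXT);
""")
conn.execute("INSERT INTO users (id, name) VALUES (1, 'taukeer')")

def create_post(conn, user_id, title):
    # Copy the author's name onto the post row so reads need no join.
    name = conn.execute("SELECT name FROM users WHERE id = ?", (user_id,)).fetchone()[0]
    conn.execute("INSERT INTO posts (user_id, author_name, title) VALUES (?, ?, ?)",
                 (user_id, name, title))

create_post(conn, 1, "De-normalization for read performance")

# Read path: a single-table lookup, no join.
row = conn.execute("SELECT title, author_name FROM posts WHERE id = 1").fetchone()
print(row)  # ('De-normalization for read performance', 'taukeer')
```

The cost is on the write side, as NigelRel3 notes further down: renaming a user now means updating every duplicated `author_name`, and a missed update leaves the data inconsistent.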
Where is the query analyzer output? EXPLAIN, or an equivalent thereof, is a great tool, and it can be used to validate join-related performance insights.

- November 7, 2020 at 6:37 am #1715 NigelRel3 (Guest)
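przemo_li's suggestion in concrete terms: `EXPLAIN QUERY PLAN` in SQLite (`EXPLAIN` in MySQL/Postgres) shows whether each side of a join is resolved by an index search or a full scan. A sketch with an invented two-table schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, user_id INTEGER, title TEXT);
""")

query = """
    SELECT u.name, p.title
    FROM users u LEFT JOIN posts p ON p.user_id = u.id
"""

# Without an index on posts.user_id, the inner side of the join is a scan
# repeated for every user row.
before = [r[-1] for r in conn.execute("EXPLAIN QUERY PLAN " + query)]

conn.execute("CREATE INDEX idx_posts_user ON posts (user_id)")
after = [r[-1] for r in conn.execute("EXPLAIN QUERY PLAN " + query)]

print(before)  # posts side appears as a SCAN
print(after)   # posts side now searched via idx_posts_user
```

Reading the plan before and after adding an index is exactly the kind of evidence that would settle whether the left joins in the article were really the bottleneck.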
This sounds more like a problem of setting customer expectations than an application problem.
They want things faster now for short term gains. When they want to make changes to the application they find that they cost twice as much because the data is spread across the database.
Some features won’t work due to the data becoming inconsistent (the main point of normalisation).
Sounds like the start of a slippery slope to bigger problems!

- November 7, 2020 at 6:37 am #1716 darkhorz (Guest)
I have never used them myself, but does anyone have any experience using views for this? By the sound of it, views address this problem.

- November 7, 2020 at 6:37 am #1717 dialtonee (Guest)
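A note on darkhorz's question: a plain view names the join but does not precompute it — SQLite, MySQL, and (for non-materialized views) Postgres re-run the underlying query on every read, so a view tidies the calling code without changing the performance profile. A materialized view (Postgres, Oracle) is the variant that actually stores the joined result. A minimal sketch of a plain view, with invented table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, user_id INTEGER, title TEXT);
    -- The view hides the join from callers but still executes it per query.
    CREATE VIEW post_feed AS
        SELECT p.id, p.title, u.name AS author_name
        FROM posts p LEFT JOIN users u ON u.id = p.user_id;
""")
conn.execute("INSERT INTO users VALUES (1, 'taukeer')")
conn.execute("INSERT INTO posts VALUES (1, 1, 'Denormalizing for reads')")

# Callers query the view as if it were a flat table.
row = conn.execute("SELECT title, author_name FROM post_feed WHERE id = 1").fetchone()
print(row)  # ('Denormalizing for reads', 'taukeer')
```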
Yeah, in our current system I am pulling through several tables to present user data. This makes sense.

- November 7, 2020 at 6:37 am #1718 secretvrdev (Guest)
Bro do you know temp tables?
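For readers unfamiliar with secretvrdev's suggestion: a temporary table lets you pay the join cost once per session or report, then query the flat result repeatedly, without permanently denormalizing the schema — it is dropped automatically when the connection closes. A sketch in SQLite with invented names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, user_id INTEGER, title TEXT);
""")
conn.execute("INSERT INTO users VALUES (1, 'taukeer')")
conn.execute("INSERT INTO posts VALUES (1, 1, 'Slow query post')")

# Pay the join cost once; the temp table vanishes with the connection.
conn.execute("""
    CREATE TEMP TABLE post_feed AS
    SELECT p.id, p.title, u.name AS author_name
    FROM posts p LEFT JOIN users u ON u.id = p.user_id
""")

# Subsequent reads hit the flat temp table, no join.
row = conn.execute("SELECT title, author_name FROM post_feed WHERE id = 1").fetchone()
print(row)  # ('Slow query post', 'taukeer')
```

Unlike the duplicated-column approach, the base tables stay normalized, so there is no write-path consistency risk; the trade-off is that the temp table is a snapshot and must be rebuilt to see new data.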