How to debug and optimize your database queries?

Why is your API responding slowly from the database? Why are your database queries slow? How will you find the root cause?

Shubham Verma, October 28, 2022



In this article, we will learn how to debug database queries and find out why our API responds slowly because of the database. We will look at ways to fix those database issues and try to boost query speed.

Slow query performance is not always a problem with the database itself; often the database is simply not being used properly.


The following are some best practices for optimizing your database queries.


👉 Use ENUM over VARCHAR

When we can use ENUM, don't use VARCHAR. ENUM columns are very fast: internally they are stored as TINYINT, yet they can contain and display string values. If a column will only ever hold a limited, fixed set of values, we should use ENUM.
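For illustration, a minimal MySQL sketch (the table and the status values here are hypothetical):

```sql
-- Hypothetical table: the status column only ever holds a few fixed values,
-- so ENUM (stored internally as a small integer) is a good fit.
CREATE TABLE orders (
    id     INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    status ENUM('pending', 'shipped', 'delivered', 'cancelled') NOT NULL DEFAULT 'pending'
);
```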

👉 Identify the connection factors

There could be some issues with the connections themselves, so we need to identify and examine the DB connections. Here are the areas to look at (a sample monitoring query follows the list):
Total Number of Connections
Number of Connections by state
Connections waiting for a lock
Maximum Transaction Age
Checkpoint Interval
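A rough PostgreSQL sketch for checking a few of these factors through pg_stat_activity (the thresholds you alert on are up to you):

```sql
-- Total number of connections and connections grouped by state
SELECT state, count(*) AS connections
FROM pg_stat_activity
GROUP BY state;

-- Connections currently waiting on a lock
SELECT count(*) AS waiting_on_lock
FROM pg_stat_activity
WHERE wait_event_type = 'Lock';

-- Age of the oldest open transaction
SELECT max(now() - xact_start) AS max_transaction_age
FROM pg_stat_activity
WHERE xact_start IS NOT NULL;
```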

👉 Optimize Your Queries Using Query Cache

The query cache is a caching mechanism provided by database engines. Most engines provide this feature nowadays, and we need to enable it. The query cache is one of the most effective techniques for improving query performance, and the database engine handles it completely. The advantage is that when the same query is executed multiple times, the result is fetched from the cache, which is very fast and saves execution time.
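For MySQL 5.7 and earlier (the query cache was removed in MySQL 8.0), a quick way to check whether the cache is enabled and whether it is actually being hit is:

```sql
-- Current query cache settings (query_cache_type, query_cache_size, ...)
SHOW VARIABLES LIKE 'query_cache%';

-- Hit and insert counters; a low Qcache_hits value suggests the cache is not helping
SHOW STATUS LIKE 'Qcache%';
```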


👉 Use 'LIMIT 1' When Getting a Unique Row

When you query your tables to fetch just one unique record and you know that only one record can match, adding `LIMIT 1` to your query can increase performance. The advantage is that once the database engine finds that one record, it stops scanning for further records instead of going through the whole table or index.
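A minimal sketch (the table and column names are hypothetical):

```sql
-- Only one row can match, so tell the engine to stop after the first hit.
SELECT id, email
FROM users
WHERE email = 'someone@example.com'
LIMIT 1;
```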

👉 Use Same Column Types for Joins

If your query contains JOINs, make sure the joined columns are indexed on both tables; this improves performance and lets the engine optimize the join internally. Also, the joined columns should be of the same type.
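A small sketch of the idea (the table, column, and index names are hypothetical); both join columns are INT UNSIGNED and indexed:

```sql
-- users.id is already the indexed primary key; index the other side of the join too.
CREATE INDEX idx_orders_user_id ON orders (user_id);

SELECT u.id, u.name, o.total
FROM users u
JOIN orders o ON o.user_id = u.id;
```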


👉 Do Not ORDER BY RAND()

You should not use ORDER BY RAND(), because the database engine has to evaluate RAND() for each and every row in the table and then sort the whole result, just to give you one row; this takes a lot of processing power and resources.
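One commonly used alternative, sketched below against a hypothetical users table; it assumes ids are roughly contiguous, so treat it as an approximation rather than a perfectly uniform pick:

```sql
-- Avoid: RAND() is evaluated for every row and then the whole result is sorted.
-- SELECT * FROM users ORDER BY RAND() LIMIT 1;

-- Cheaper: pick a random id once, then fetch the first row at or above it.
SET @rand_id = (SELECT FLOOR(1 + RAND() * MAX(id)) FROM users);
SELECT * FROM users WHERE id >= @rand_id ORDER BY id LIMIT 1;
```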

👉 Don't use SELECT *

We should fetch only the required fields and not use 'SELECT *'. 'SELECT *' increases the time spent in disk operations: the more fields the engine needs to fetch, the slower your query will be. If your database server is separate from your web server, the extra data transferred between the servers also adds network delay.
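A small sketch (the table and columns are hypothetical):

```sql
-- Prefer naming only the columns you actually need:
SELECT id, name, email FROM users WHERE id = 42;

-- instead of pulling every column across disk and network:
-- SELECT * FROM users WHERE id = 42;
```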

👉 Have an id Field

Try to have an id field that is the PRIMARY KEY, AUTO_INCREMENT, and preferably UNSIGNED.

Even if a table has a naturally unique field (such as a username), avoid making it your primary key; VARCHAR fields as primary keys are slower.
By using a numeric id, you keep a good database structure in which tables reference each other by their ids internally.
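A minimal sketch (the table and column names are hypothetical): keep the natural unique value as a UNIQUE key and use a numeric surrogate id as the primary key:

```sql
CREATE TABLE users (
    id       INT UNSIGNED NOT NULL AUTO_INCREMENT,  -- surrogate primary key
    username VARCHAR(50)  NOT NULL,                 -- naturally unique, but not the PK
    PRIMARY KEY (id),
    UNIQUE KEY uq_users_username (username)
);
```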



👉 Query Execution Time

The query execution time could be one of the reasons for a slow response. We need to measure it by periodically analyzing the query logs at the application level.
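One way to capture slow queries in MySQL is the slow query log; a minimal sketch, where the 1-second threshold is an illustrative assumption:

```sql
SET GLOBAL slow_query_log = 'ON';           -- start logging slow statements
SET GLOBAL long_query_time = 1;             -- log anything that takes longer than 1 second
SHOW VARIABLES LIKE 'slow_query_log_file';  -- where the log is written
```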


👉 Use Indexing

Always add indexes on the columns you perform searches on. Indexes are not only for primary or unique keys; they speed up query performance in general. Wrongly defined or missing indexes are among the major reasons for bad performance, so we should use proper indexing and fix any indexing issues.
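A small sketch (the table, column, and index names are hypothetical):

```sql
-- Index a column that is frequently used in WHERE clauses.
CREATE INDEX idx_orders_created_at ON orders (created_at);

-- A composite index for queries that filter on both columns together.
CREATE INDEX idx_orders_user_status ON orders (user_id, status);
```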



👉 Use 'EXPLAIN' in Your SELECT Queries

EXPLAIN describes how your queries are executed. Put 'EXPLAIN' before your 'SELECT'; by doing so, you can see whether indexes are being used properly and what kind of join is being performed.
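As a rough before-and-after sketch (the table, column, and index names are hypothetical), in MySQL this might look like:

```sql
-- Without an index: EXPLAIN typically reports type = ALL (a full table scan).
EXPLAIN SELECT * FROM orders WHERE user_id = 42;

-- Add an index on the filtered column.
CREATE INDEX idx_orders_user_id ON orders (user_id);

-- With the index: EXPLAIN should now report type = ref and show the chosen key.
EXPLAIN SELECT * FROM orders WHERE user_id = 42;
```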



👉 Use NOT NULL If You Can

If possible, always declare columns as NOT NULL unless you have a very specific reason to store NULL values in a column. NULL-able columns require additional space and can increase the complexity of your queries and comparison statements.
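A small sketch (the table and column names are hypothetical):

```sql
CREATE TABLE customers (
    id    INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    name  VARCHAR(100) NOT NULL,             -- required, so declared NOT NULL
    phone VARCHAR(20)  NOT NULL DEFAULT ''   -- prefer an empty default over NULL where it makes sense
);
```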


👉 Use PREPARE statements

The PREPARE statement prepares a SQL statement and assigns it a name, stmt_name, by which to refer to the statement later.
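A minimal MySQL sketch (the statement name, table, and parameter value are hypothetical):

```sql
PREPARE get_user FROM 'SELECT id, name FROM users WHERE id = ?';
SET @uid = 42;
EXECUTE get_user USING @uid;    -- the statement is parsed once and reused
DEALLOCATE PREPARE get_user;    -- release it when you are done
```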

PREPARE statements give us a lot of benefits for both performance and security reasons. They are also now compatible with query caching, which further helps performance.


👉 Use Fixed-length or Static Tables

Fixed-length (static) tables are faster, so we should use them where possible. If every single column in a table is fixed-length, then that table is considered a "fixed-length" or "static" table.
If you use variable-length types such as TEXT, BLOB, or VARCHAR (even for just one column), the table is handled differently by the MySQL engine. Fixed-length tables are faster because the engine can quickly calculate the position of each row; if the row size is not fixed, it has to consult the primary key index every time it needs to do a seek.
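A small sketch (the table is hypothetical); CHAR columns keep every row the same size:

```sql
CREATE TABLE country_codes (
    code CHAR(2)  NOT NULL,   -- fixed-length instead of VARCHAR(2)
    name CHAR(60) NOT NULL    -- fixed-length, at the cost of some padding
);
```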

👉 Split the Queries

Split big DELETE or INSERT queries when you need to perform a large delete or insert operation, because a big DELETE or INSERT can lock your tables and bring your application to a halt.
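A minimal MySQL sketch of a batched delete (the table, cutoff date, and batch size are illustrative assumptions); run it repeatedly until no rows are affected:

```sql
-- Delete old rows in small chunks so the table is never locked for long.
DELETE FROM audit_logs
WHERE created_at < '2022-01-01'
LIMIT 10000;
```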

👉 Make Smaller Columns

Smaller columns are faster: they are more compact than bigger ones, which helps performance and reduces the amount of data transferred from disk.

If your table has very few rows, use MEDIUMINT, SMALLINT, or in some cases even TINYINT as the primary key; there is no reason to make the primary key an INT.
If you do not need the time component, use DATE instead of DATETIME.
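A small sketch (the table is hypothetical):

```sql
CREATE TABLE statuses (
    id             TINYINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,  -- small lookup table
    label          VARCHAR(30) NOT NULL,
    effective_date DATE NOT NULL   -- DATE instead of DATETIME when the time part is not needed
);
```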


👉 Get Suggestions with PROCEDURE ANALYSE()

If your database is MySQL, you can take help from PROCEDURE ANALYSE(). It forces MySQL to analyze the column structures and the actual data, and it comes up with suggestions about the column types best suited to your data set.
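For example (MySQL 5.7 and earlier; PROCEDURE ANALYSE() was removed in MySQL 8.0), with a hypothetical table:

```sql
-- Suggests optimal column types based on the data actually stored in the table.
SELECT * FROM orders PROCEDURE ANALYSE();
```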

Here are some tools and techniques for debugging database queries.


👉 Find and Kill Long Running Query/Connection

Sometimes queries inside a transaction keep running for a long time, using resources without any specific reason and not releasing them, so other transactions cannot get those resources. This can increase query response times and can even cause 100% CPU utilization. To fix this situation, the possible solution is to find the long-running queries and kill them.
For example, in PostgreSQL you can list the processes that have been running for a long time and then kill a specific process by its PID.
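A rough PostgreSQL sketch; the 5-minute threshold and the PID value are illustrative assumptions:

```sql
-- List non-idle queries that have been running for more than 5 minutes.
SELECT pid, now() - query_start AS duration, state, query
FROM pg_stat_activity
WHERE state <> 'idle'
  AND now() - query_start > interval '5 minutes'
ORDER BY duration DESC;

-- Terminate a specific backend using the pid from the result above.
SELECT pg_terminate_backend(12345);
```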


👉 SQL Server Profiler (SQL Queries)

SQL Server Profiler is used to understand query execution: it displays how SQL Server resolves queries behind the scenes. It lets you see exactly which Transact-SQL statements or Multi-Dimensional Expressions are submitted to the server, how the server accesses the database, and how the results are returned.
You can do the following using SQL Server Profiler:
Create a trace
Watch the trace results
Store the trace results
Start, stop, pause, and modify the trace results
Replay the trace results


👉 Use Performance Tuning

Performance tuning covers database index management, query optimization, and optimizing the SQL client code. Database performance tuning helps reduce network traffic, disk I/O, and CPU time, and it minimizes the response time of your queries by making the best use of your system resources.

👉 Use the Client Statistics feature

Use the Client Statistics feature to debug database query performance. Client Statistics can be found in SQL Server Management Studio. It reports statistics such as network traffic and the application profile, and its execution time statistics help you determine whether a query's performance has improved and debug query performance issues.

Conclusion:

In this article, we learned the following things:
Use ENUM over VARCHAR
Identify the connection factors
Optimize Your Queries Using Query Cache
Use 'LIMIT 1' When Getting a Unique Row
Use Same Column Types for Joins
Do Not ORDER BY RAND()
Don't use SELECT *
Have an id Field
Query Execution Time
Use Indexing
Use 'EXPLAIN' in Your SELECT Queries
Use NOT NULL If You Can
Use PREPARE statements
Use Fixed-length or Static Tables
Split the Queries
Make Smaller Columns
Get Suggestions with PROCEDURE ANALYSE()
Some tools to debug the database queries
Find and Kill Long Running Query/Connection
SQL Server Profiler (SQL Queries)
Use Performance Tuning
Use the Client Statistics feature





