MySQL queries taking a long time in AJAX requests

Date: 2020-12-02 00:15:48

The problem

Every AJAX request containing any DB query is taking a lot more time than normal.

I haven't updated the codebase in a week, but all of a sudden every DB query made in an AJAX request is taking a long time. One thing to notice here is that if the query is written in a page that is loaded normally, as if you were visiting www.example.com/mypage.php ,

mypage.php:

<?php

   // $db_handler is assumed to be an existing PDO connection
   $query = $db_handler->prepare(
      "SELECT * FROM table_x LIMIT 5"
   );
   $query->execute();
   $fetch = $query->fetchAll(PDO::FETCH_ASSOC);

?>

The page loads up very quickly with all the results.

But if it's done in an AJAX response file, it takes a long time (say, 15 seconds) to load.

Client-side AJAX code:

$.ajax
({
    url: 'server_files/ajaxtest.php',
    type: 'POST',
    dataType: 'JSON',
    data:
    {
        data: 'some data'
    },
    success: function(data)
    {
        if( data.success === true )
        {

        }
        else if( data.success === false )
        {

        }
    },
    error: function(e)
    {
        alert('Error');
    }
});

ajax_response.php:

<?php

   header('Content-Type: application/json');

   // $db_handler is assumed to be an existing PDO connection
   if( isset($_POST['data']) )
   {
       $query = $db_handler->prepare(
          "SELECT * FROM table_x LIMIT 5"
       );
       $query->execute();
       $fetch = $query->fetchAll(PDO::FETCH_ASSOC);

       echo json_encode([
           'success'  => true,
           'response' => $fetch
       ]);
   }
?>

^ takes 15 seconds to load. (A query returning 5 rows (LIMIT 5) takes the same time as one returning 10 rows (LIMIT 10).)

If the same file contains only this:

<?php

   header('Content-Type: application/json');

   if( isset($_POST['data']) )
   {    
       echo json_encode([
           'success'  => true
       ]);
   }
?>

^ takes 300-400ms to load

Obviously a query will increase the response time a little (1-3 seconds), but 15 seconds is too much.
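
To narrow down where the 15 seconds go, a quick timing sketch can separate connection time from query time. This is a hypothetical diagnostic, not code from the question; the DSN host, database name, and credentials are placeholders to be replaced with your own:

```php
<?php

// Placeholder DSN values -- substitute your real host, database and credentials.
$t0 = microtime(true);
$db_handler = new PDO('mysql:host=127.0.0.1;dbname=mydb', 'user', 'pass');
$t1 = microtime(true);

// Same query as in the question above.
$query = $db_handler->prepare("SELECT * FROM table_x LIMIT 5");
$query->execute();
$query->fetchAll(PDO::FETCH_ASSOC);
$t2 = microtime(true);

// If the connect step dominates, the delay is in DNS/networking,
// not in the query itself.
printf("connect: %.3fs, query: %.3fs\n", $t1 - $t0, $t2 - $t1);
```

Dropping this at the top of ajax_response.php would show immediately whether the 15 seconds are spent connecting or querying.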


What I've done

1) I've contacted my hosting provider, but that didn't help much.

2) I've also installed mysqltuner, which shows this:

-------- General Statistics --------------------------------------------------
[--] Skipped version check for MySQLTuner script
[OK] Currently running supported MySQL version 5.5.49-0ubuntu0.14.04.1-log
[OK] Operating on 64-bit architecture

-------- Storage Engine Statistics -------------------------------------------
[--] Status: +Archive -BDB -Federated +InnoDB -ISAM -NDBCluster
[--] Data in PERFORMANCE_SCHEMA tables: 0B (Tables: 17)
[--] Data in MyISAM tables: 27K (Tables: 13)
[--] Data in InnoDB tables: 6M (Tables: 21)
[!!] Total fragmented tables: 21

-------- Security Recommendations  -------------------------------------------
[!!] User 'rootAlbert@127.0.0.1' has no password set.
[!!] User 'rootAlbert@::1' has no password set.
[!!] User 'rootAlbert@lamp' has no password set.

-------- Performance Metrics -------------------------------------------------
[--] Up for: 11h 23m 42s (21K q [0.533 qps], 11K conn, TX: 6M, RX: 2M)
[--] Reads / Writes: 92% / 8%
[--] Total buffers: 432.0M global + 2.7M per thread (151 max threads)
[OK] Maximum possible memory usage: 837.8M (84% of installed RAM)
[OK] Slow queries: 2% (488/21K)
[OK] Highest usage of available connections: 3% (6/151)
[OK] Key buffer size / total MyISAM indexes: 16.0M/156.0K
[OK] Key buffer hit rate: 99.2% (133 cached / 1 reads)
[OK] Query cache efficiency: 61.9% (6K cached / 10K selects)
[OK] Query cache prunes per day: 0
[OK] Sorts requiring temporary tables: 0% (0 temp sorts / 113 sorts)
[!!] Temporary tables created on disk: 50% (421 on disk / 842 total)
[OK] Thread cache hit rate: 99% (6 created / 11K connections)
[OK] Table cache hit rate: 33% (75 open / 223 opened)
[OK] Open file limit used: 1% (76/6K)
[OK] Table locks acquired immediately: 100% (4K immediate / 4K locks)
[OK] InnoDB data size / buffer pool: 6.5M/128.0M

-------- Recommendations -----------------------------------------------------
General recommendations:
    Run OPTIMIZE TABLE to defragment tables for better performance
    MySQL started within last 24 hours - recommendations may be inaccurate
    When making adjustments, make tmp_table_size/max_heap_table_size equal
    Reduce your SELECT DISTINCT queries without LIMIT clauses
Variables to adjust:
    tmp_table_size (> 16M)
    max_heap_table_size (> 16M)
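
Following the tuner's last two recommendations, both variables could be raised together in my.cnf. The 64M below is an arbitrary illustrative value, not a tuned one:

```ini
[mysqld]
# mysqltuner recommends keeping these two equal and above 16M,
# since half of the temporary tables are being created on disk.
tmp_table_size      = 64M
max_heap_table_size = 64M
```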

3) Searched a lot and updated my my.cnf file. This is my current my.cnf (it looked somewhat different at the time the problem occurred):

[client]
port        = 3306
socket      = /var/run/mysqld/mysqld.sock

[mysqld_safe]
socket      = /var/run/mysqld/mysqld.sock
nice        = 0

[mysqld]
local-infile=0
log=/var/log/mysql-logfile
skip_name_resolve

user        = mysql
pid-file    = /var/run/mysqld/mysqld.pid
socket = /var/run/mysqld/mysqld.sock
port = 3306
basedir     = /usr
datadir = /var/lib/mysql
tmpdir      = /tmp
lc-messages-dir = /usr/share/mysql
skip-external-locking

slow-query-log = 1 
slow-query-log-file = /var/log/mysql-slow.log 
long_query_time = 2 
log-queries-not-using-indexes 

key_buffer      = 16M
max_allowed_packet = 32M
thread_stack        = 192K
thread_cache_size       = 8

myisam-recover         = BACKUP

query_cache_type=1
query_cache_limit=2M
query_cache_size=256M

tmp_table_size=16M
max_heap_table_size=16M
table_cache=3084

log_error = /var/log/mysql/error.log

expire_logs_days    = 10
max_binlog_size         = 100M
big-tables

[mysqldump]
quick
quote-names
max_allowed_packet  = 16M

[mysql]

[isamchk]
key_buffer      = 16M

!includedir /etc/mysql/conf.d/

4) Optimized all the tables in the DB

5) I had also upgraded my server from 1GB memory, 1 CPU and 2TB transfer to 2GB memory, 2 CPUs and 3TB transfer.

I still don't understand why this is happening or how to solve it.

2 solutions

#1


Score: 7

The problem was in the connection string. I was using my domain name (example.com) for connecting to the Database. So I changed it to my IP address and it resolved the problem.
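
In PDO terms, the fix amounts to changing the host in the DSN so that each new connection no longer pays a slow DNS lookup. The database name, credentials, and IP below are placeholders, not values from the question:

```php
<?php

// Before: host resolved via DNS on every new connection (the slow part)
// $db_handler = new PDO('mysql:host=example.com;dbname=mydb', 'user', 'pass');

// After: connect directly to the server's IP address (placeholder IP)
$db_handler = new PDO('mysql:host=203.0.113.10;dbname=mydb', 'user', 'pass');
```

Note that the skip_name_resolve already set in the my.cnf above only disables the server's reverse-DNS lookup of clients; this change addresses the client's forward DNS lookup of the server, which is a separate step.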

Thanks everyone for your help.

#2


Score: 0

I think you must first figure out whether it is the query that is taking the time.

- Run the same query in phpMyAdmin and check the time. If the query is slow, try removing the LIMIT, or add an index for it.
- If the query is not slow, check for a network error or issue while fetching the data. Remove the query from the AJAX response for the time being, send static data instead, and check how long that takes.

Hope this helps.
