PHP
zorgingyaringen, 2017-01-08 00:12:11

How can PHP be processed in HTML pages when non-existent pages are redirected to index.html?

The site's HTML pages load the api.php file via an include.
There is one nuance: all non-existent pages are redirected to index.html.
The server runs plain nginx + PHP-FPM 7.

server {
  listen 30.30.30.30:80;
  server_name domen.ru;

  error_log /var/log/nginx/domen.ru.error.log;

  root  /home/www-data/sites/domen.ru/;
  index index.php index.html index.htm;

  location ~ \.php$ {
    include /etc/nginx/fastcgi_params;
    fastcgi_pass unix:/var/run/domen.sock;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
  }

  # Redirect 404 -> index.html
  location / {
    try_files $uri $uri/ /index.html?q=$uri;
  }
}

5 answer(s)
Wexter, 2017-01-08
@zorgingyaringen

location ^~ /index.html {
  include /etc/nginx/fastcgi_params;
  fastcgi_pass unix:/var/run/domen.sock;
  fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
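
Note that this location alone may not be enough: by default PHP-FPM refuses to execute anything that does not end in .php (the security.limit_extensions pool setting). A minimal sketch of the pool change, assuming a Debian-style pool file path, which is an assumption and not taken from the question:

; /etc/php/7.0/fpm/pool.d/www.conf  (path is an assumption; adjust to your setup)
; allow PHP-FPM to execute .html files in addition to .php
security.limit_extensions = .php .html

After changing the pool file, restart the php-fpm service so the setting takes effect.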

Andrey Prozorov, 2017-06-22
@emerysh

Why the DrawTableNames and DrawTableSites functions? Where are they used? What goes into the for($i=0; $i < $numrows; $i++) loops? Maybe there are 1,000,000,000,000 rows?
Next point: with USING(item), item has to exist in both tables. Check that it has the same type and size in both of the tables you are joining.
And yes, tables of wild sizes take an exceptionally long time to render. There is a way to speed this up, but only slightly: pass the data as JSON. JavaScript on the client side consumes it and asynchronously builds the billion-row table line by line. You can also add lazy loading with LIMIT, and then everything will definitely work quickly (a sketch follows below).
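
A minimal sketch of such a paginated JSON endpoint, reusing the connection credentials and the table1/table2 join that appear in the other answers; the page parameter and endpoint shape are illustrative assumptions, not from the original:

<?php
// paginated JSON endpoint - a sketch; table names and ?page= parameter are assumptions
$link = mysqli_connect('localhost', 'root', '', 'price_sites')
    or die('Error: ' . mysqli_connect_error());

$limit  = 1000;
$page   = isset($_GET['page']) ? max(0, (int)$_GET['page']) : 0;
$offset = $page * $limit;

$query  = sprintf('SELECT * FROM table1 LEFT JOIN table2 USING(item) LIMIT %d,%d',
                  $offset, $limit);
$result = mysqli_query($link, $query);

$rows = array();
while ($row = mysqli_fetch_assoc($result)) {
    $rows[] = $row; // collect one batch of rows
}

header('Content-Type: application/json');
echo json_encode($rows); // the client-side JS appends these rows to the table

The client then requests page=0, page=1, ... as the user scrolls, so no single response ever carries the whole table.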

Oleg, 2017-06-22
@politon

First, try to measure the total time it takes just to connect to the database:

<?php
$start = microtime(true);
$host = 'localhost'; // server address
$database = 'price_sites';
$user = 'root';
$password = '';
$link = mysqli_connect($host, $user, $password, $database)
    or die("Error " . mysqli_connect_error());

mysqli_close($link);
echo 'Script executed in: ' . (microtime(true) - $start) . ' sec.';

For example, I get 0.008999938964844 sec.

FanatPHP, 2017-06-22
@FanatPHP

> If the same request is made in phpmyadmin:
phpMyAdmin is a poor rattle for lamers. Using it to measure query execution speed is like reading the time off a stopped clock: twice a day it will be correct...
Firstly, phpMyAdmin adds a LIMIT to your query and requests only 20 rows out of your 100 thousand. Try querying everything and see how long that takes. If it finishes at all.
Secondly, even for a query with a LIMIT, 0.0002 is too fast to be real. The result clearly comes from the query cache: the query was executed once, got cached, and now cheerfully returns a canned result.
Thirdly, as everyone here has already said, rendering a table with a hundred thousand rows is completely beyond the bounds of meaningfulness. Who is going to read it? Even if you export it to XLS.
In general, first measure how long the WHOLE query takes (a sketch below shows one way), and then reduce the number of requested rows to reasonable values.
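
A minimal sketch of that measurement, assuming the same table1 LEFT JOIN table2 USING(item) query from the other answers; SQL_NO_CACHE keeps the MySQL query cache out of the timing so the number is honest:

<?php
// time the FULL query, including fetching every row - a sketch
$link = mysqli_connect('localhost', 'root', '', 'price_sites')
    or die('Error: ' . mysqli_connect_error());

$start = microtime(true);
// SQL_NO_CACHE bypasses the query cache, so repeated runs stay realistic
$result = mysqli_query($link,
    'SELECT SQL_NO_CACHE * FROM table1 LEFT JOIN table2 USING(item)');
$count = 0;
while ($row = mysqli_fetch_assoc($result)) {
    $count++; // fetch every row so transfer time is included in the measurement
}
echo $count . ' rows in ' . (microtime(true) - $start) . ' sec.';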

Nikita Tratorov, 2017-06-27
@NikitaTratorov

In light of the above, if you do not change the logic and you really need to fetch all the rows, for example to output them to CSV or Excel, do something like this:

//set_time_limit(120); // you may need to increase the execution time limit
$db_host = 'localhost';
$db_user = 'root';
$db_password = '';
$db_name = 'price_sites';

$linkps = mysqli_connect($db_host, $db_user, $db_password, $db_name);

$limit = 1000;
$offset = 0;
// run the first query, counting the total number of results
$query = sprintf("SELECT SQL_CALC_FOUND_ROWS * FROM table1 LEFT JOIN table2 USING(item) LIMIT %d,%d", $offset, $limit);
$result = mysqli_query($linkps, $query);

// get the total number of results
$resultTotal = mysqli_query($linkps, "SELECT FOUND_ROWS() AS total");
$res = mysqli_fetch_assoc($resultTotal);
$numrows = $res['total'];

// loop over the results batch by batch
for ($offset = 0; $offset <= $numrows; $offset += $limit) {
    // fetch the rows of the current batch
    while ($row = mysqli_fetch_assoc($result)) {
        // do something with the fetched row
        print_r($row);
    }
    // request the next batch of data (starting right after the current one)
    $query = sprintf("SELECT * FROM table1 LEFT JOIN table2 USING(item) LIMIT %d,%d", $offset + $limit, $limit);
    $result = mysqli_query($linkps, $query);
}

I'll say right away that the code may not work as-is, since I wrote it from memory.
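
One design note, as an aside not from the original answer: LIMIT offset,count gets slower as the offset grows, because MySQL still scans and discards all the skipped rows. If table1 has an indexed, monotonically increasing column (an assumption; call it id here), keyset pagination avoids that cost. A minimal sketch under that assumption:

<?php
// keyset pagination - a sketch; assumes an indexed integer column `id` on table1
$linkps = mysqli_connect('localhost', 'root', '', 'price_sites');

$limit  = 1000;
$lastId = 0;
do {
    // seek past the last seen id instead of skipping rows with OFFSET
    $query = sprintf(
        "SELECT * FROM table1 LEFT JOIN table2 USING(item) WHERE table1.id > %d ORDER BY table1.id LIMIT %d",
        $lastId, $limit
    );
    $result = mysqli_query($linkps, $query);
    $fetched = 0;
    while ($row = mysqli_fetch_assoc($result)) {
        $lastId = (int)$row['id']; // remember where this batch ended
        $fetched++;
        // do something with the row
    }
} while ($fetched === $limit); // a short final batch means we are done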
