I'm writing an app that makes a daily post as a user, and having benchmarked the PHP code that does this, it seems to take about two seconds per user. I'm dividing the work up into chunks and using multiple cron jobs to handle each chunk. I'd like to scale to many thousands of users one day, but this kind of workload is just too much. It would take my server literally all day to post to each user one at a time using this method.
How do people normally do this? I've seen other apps that do it. Is there some way of sending all these posts off at once using just one API call? Using individual API calls per user is crazy slow.
Thanks.
2 Answers
#1
0
On one hand, this is entirely dependent on the API.
However, you could use a multi-threaded or pseudo-parallel approach to this, such that your program sends, say, 100 HTTP POST requests at a time, rather than generating one request after another in series.
Since you're using PHP, true multi-threading is out (I think), but this question is very similar to others; for example, see how these folks recommend curl_multi.
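A minimal sketch of that curl_multi approach might look like the following. The endpoint URL and per-user payloads here are placeholders; in practice they would come from your own database and the API you're posting to.

```php
<?php
// Hypothetical per-user payloads; in a real app these come from your DB.
$posts = array(
    array('url' => 'https://api.example.com/post', 'body' => 'message=hello+user+1'),
    array('url' => 'https://api.example.com/post', 'body' => 'message=hello+user+2'),
);

$multi = curl_multi_init();
$handles = array();

// Register one easy handle per request so they run concurrently.
foreach ($posts as $i => $post) {
    $ch = curl_init($post['url']);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $post['body']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($multi, $ch);
    $handles[$i] = $ch;
}

// Drive all transfers until every request has finished.
do {
    $status = curl_multi_exec($multi, $active);
    if ($active) {
        curl_multi_select($multi); // block briefly instead of busy-waiting
    }
} while ($active && $status == CURLM_OK);

// Collect responses and clean up.
$responses = array();
foreach ($handles as $i => $ch) {
    $responses[$i] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($multi, $ch);
    curl_close($ch);
}
curl_multi_close($multi);
```

In real use you'd cap the number of concurrent handles (say, 100 at a time) so you don't exhaust sockets or hammer the remote API.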
#2
0
You can use a batch query to achieve what you need.
The code for a batch query is shown below. You can read more about Facebook batch queries at: http://25labs.com/tutorial-post-to-multiple-facebook-wall-or-timeline-in-one-go-using-graph-api-batch-request/
// Common post body shared by every request in the batch.
$body = array(
    'message'     => $_POST['message'],
    'link'        => $_POST['link'],
    'picture'     => $_POST['picture'],
    'name'        => $_POST['name'],
    'caption'     => $_POST['caption'],
    'description' => $_POST['description'],
);

// Build one POST operation per target user's feed.
$batchPost = array();
foreach (array('{ID1}', '{ID2}', '{ID3}') as $id) {
    $batchPost[] = array(
        'method'       => 'POST',
        'relative_url' => "/{$id}/feed",
        'body'         => http_build_query($body),
    );
}

// Send the whole batch in a single Graph API call.
$multiPostResponse = $facebook->api('?batch=' . urlencode(json_encode($batchPost)), 'POST');
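Note that the Graph API caps a batch at 50 operations, so for thousands of users you'd split the ID list into chunks and make one batch call per chunk. A sketch of that chunking step (the user IDs and message here are placeholders):

```php
<?php
// Hypothetical list of target user IDs; in practice loaded from your database.
$userIds = range(1, 120);

$batches = array();
// array_chunk splits the IDs into groups of at most 50,
// the maximum number of operations allowed in one Graph API batch call.
foreach (array_chunk($userIds, 50) as $chunk) {
    $batchPost = array();
    foreach ($chunk as $id) {
        $batchPost[] = array(
            'method'       => 'POST',
            'relative_url' => "/{$id}/feed",
            'body'         => http_build_query(array('message' => 'hello')),
        );
    }
    $batches[] = $batchPost;
}

// 120 users -> 3 batch calls (50 + 50 + 20 operations).
echo count($batches), "\n";
```

Each element of $batches would then be sent as one `?batch=` call like the one above, turning thousands of individual posts into a few dozen API round trips.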