How to handle POST data larger than 2 MB

Time: 2021-04-09 01:51:52

I have JSON POST data with the following template:

{
    "themeId" : JSONString,
    "themeName" : JSONString,
    "tables" : [{
        "tableName" : JSONString,
        "records" : [{
            "recordVersion" : JSONString,
            "tableItems" : []
        }]
    }]
}

and on the Java side I have a REST API like this:

@POST
@Path("/{themeId}")
@Consumes({MediaType.APPLICATION_JSON})
public Response postTheme( @PathParam("themeId") String themeId, ThemeDictionary dictionary) throws InterruptedException {
    //code to handle
}

It works fine when the POST data is less than 2 MB, but how do I handle data larger than 2 MB?

Questions

1) Should I go with pagination?

2) If I split the JSON in half, each half won't be valid JSON. So should I accept strings and concatenate them on the server side?

3) Are there any good examples of handling this scenario?

4) I'm looking for an approach that can handle JSON data both smaller and larger than 2 MB.

5 Answers

#1 (4 votes)

Pagination will not solve your problem, since you are sending data to the server, not receiving it.

What servlet container do you use? This looks like Tomcat's default POST size limit.

If you are using standalone Tomcat, you need to set the maxPostSize parameter on your Connector: see here or here.

#2 (2 votes)

2 MB is rather small. Uploading the JSON as a multipart file and then processing it normally can handle files up to about 50 MB. An example of handling file uploads can be found here.
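
A minimal sketch of such an endpoint, assuming Jersey with the jersey-media-multipart module registered and Jackson for deserialization (the /upload path and the "file" part name are illustrative, not from the original API):

    import java.io.IOException;
    import java.io.InputStream;
    import javax.ws.rs.Consumes;
    import javax.ws.rs.POST;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.core.MediaType;
    import javax.ws.rs.core.Response;
    import org.glassfish.jersey.media.multipart.FormDataParam;
    import com.fasterxml.jackson.databind.ObjectMapper;

    @POST
    @Path("/{themeId}/upload")
    @Consumes(MediaType.MULTIPART_FORM_DATA)
    public Response postThemeFile(@PathParam("themeId") String themeId,
                                  @FormDataParam("file") InputStream uploadedJson)
            throws IOException {
        // Deserialize the uploaded part into the existing model class.
        ThemeDictionary dictionary = new ObjectMapper().readValue(uploadedJson, ThemeDictionary.class);
        // ...same handling as the original postTheme endpoint...
        return Response.ok().build();
    }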

For JSON files of hundreds of MB, you have to process the input as a stream, or split the file into smaller files.
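
A rough sketch of the streaming idea using Jackson's streaming API (the handleRecordVersion callback is a hypothetical placeholder for whatever per-record processing you need):

    import java.io.IOException;
    import java.io.InputStream;
    import com.fasterxml.jackson.core.JsonFactory;
    import com.fasterxml.jackson.core.JsonParser;
    import com.fasterxml.jackson.core.JsonToken;

    // Walk the document token by token so it is never fully held in memory.
    void streamTheme(InputStream in) throws IOException {
        try (JsonParser parser = new JsonFactory().createParser(in)) {
            while (parser.nextToken() != null) {
                if (parser.currentToken() == JsonToken.FIELD_NAME
                        && "recordVersion".equals(parser.getCurrentName())) {
                    parser.nextToken();                    // advance to the value
                    handleRecordVersion(parser.getText()); // hypothetical callback
                }
            }
        }
    }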

#3 (1 vote)

Pagination would be a good option, but it needs manual intervention. Alternatively, you can send multiple async requests (i.e., send records 1-200 in one request and 201-400 in the next), but that is not a recommended way, since your server will receive more requests as the number of records grows.
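
A hedged client-side sketch of that idea using the JDK 11 HttpClient (the URL and the pre-chunked JSON strings are illustrative assumptions):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.List;

    // Fire one async POST per chunk of records. Each chunk must itself be
    // a valid JSON document, not an arbitrary substring of the payload.
    void postInChunks(HttpClient client, String themeId, List<String> chunkBodies) {
        for (String body : chunkBodies) {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://example.com/themes/" + themeId))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();
            // A real client would collect and check the returned CompletableFutures.
            client.sendAsync(request, HttpResponse.BodyHandlers.ofString());
        }
    }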

#4 (1 vote)

JSON compresses very well. You should think about compressing the payload.
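
As a rough sketch with plain JDK classes (the server must be set up to accept a Content-Encoding: gzip request body, which is deployment-specific):

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.util.zip.GZIPOutputStream;

    // Gzip the JSON before sending; repetitive record data of this shape
    // usually shrinks dramatically.
    byte[] gzipJson(String json) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(buffer)) {
            gzip.write(json.getBytes(StandardCharsets.UTF_8));
        }
        return buffer.toByteArray(); // send with the header Content-Encoding: gzip
    }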

Yes, you should go with pagination, but it will have some downsides, such as consistency.

You should not send the data by splitting it into arbitrary strings. I suggest you send meaningful units of data, so the pagination itself is meaningful: if one part (block) of the message goes missing, you only have to re-send that part, not all of them.

"how can you eat a really big fish? - by slicing thin".

“你怎么能吃一条大鱼呢?”通过切片薄”。

Try to post smaller, meaningful parts. Otherwise your servers will need more computing time to process the data, and your clients will need more memory.

#5 (1 vote)

Is there any reason why you are not sending the data in one single request? Send the 50 MB as one request. Neither the JSON nor the HTTP POST specification limits the size of the data, as discussed in the SO questions below:

Is there a limit on how much JSON can hold?

Is Http POST limitless?

If you are worried about the performance of your server, one possible option is to split your JSON logically so that the action can be performed in smaller chunks.

For example, if your tables array has 200 items in it, you can consider splitting it into smaller chunks, say 50 or 20 tables per request.

{
    "totalPages":2,
    "themeId" : JSONString,
    "themeName" : JSONString,
    "tables" : [{
        //first 50 tables
        "tableName" : JSONString,
        "records" : [{
            "recordVersion" : JSONString,
            "tableItems" : []
        }]
    }]
}

Next request:

{
    "totalPages":2,
    "themeId" : JSONString,
    "themeName" : JSONString,
    "tables" : [{
        //next 50 tables
        "tableName" : JSONString,
        "records" : [{
            "recordVersion" : JSONString,
            "tableItems" : []
        }]
    }]
}

If you do not need the complete data to process the request, you can act on the data as it arrives. If you do, store the tables arrays in a DB, file, or memory until the last page is received; on the last request, merge the JSON back together, process the request, and send back the proper response. In the second case there is not much performance improvement.
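
A hedged server-side sketch of the second case (the in-memory page buffer, the totalPages getter, and the mergeTables helper are illustrative assumptions, not part of the original API):

    import java.util.List;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.CopyOnWriteArrayList;

    // Buffer incoming pages per themeId until every page has arrived.
    // A real implementation would also evict abandoned uploads.
    private final Map<String, List<ThemeDictionary>> pendingPages = new ConcurrentHashMap<>();

    public Response postThemePage(String themeId, ThemeDictionary page) {
        List<ThemeDictionary> pages =
                pendingPages.computeIfAbsent(themeId, id -> new CopyOnWriteArrayList<>());
        pages.add(page);
        if (pages.size() < page.getTotalPages()) {   // assumes a totalPages field on the model
            return Response.accepted().build();      // more chunks still expected
        }
        ThemeDictionary merged = mergeTables(pendingPages.remove(themeId)); // hypothetical helper
        // ...process the merged dictionary exactly as in the single-request case...
        return Response.ok().build();
    }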
