Is it possible to create a query for this?

Time: 2021-08-23 16:10:53

Let's say I have an API that constantly gets updated. Basically I get it from a site and it's in JSON format. But instead of using the usual GET & POST methods, I'd like to know if it's possible to get it using PostgreSQL.


What I'm trying to do is create a daily routine which will run a function using pgAdmin to update the tables in the server based on the JSON data from the API, so it runs automatically every day without needing manual input.


So is it possible to create a GET method or something similar in the form of a query? Or are there other methods to approach this kind of matter? Basically, getting the JSON data from the API automatically every day.


1 solution

#1



Yep, it is possible, but it's not really the ideal use-case for SQL.


You could use an HTTP client extension (for example, the pgsql-http extension) from within your Postgres query to make GET and POST requests to the site, and the JSON will be returned as a string which you could store in a json or jsonb field. If you are unable to install extensions on your Postgres database, I don't believe there's anything built into Postgres to make HTTP requests.

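As a rough sketch, assuming the pgsql-http extension is available (the URL and the staging table name below are just placeholders), fetching and storing the raw payload could look like this:

    -- Requires the pgsql-http extension (https://github.com/pramsey/pgsql-http)
    CREATE EXTENSION IF NOT EXISTS http;

    -- Hypothetical staging table for the raw payloads
    CREATE TABLE IF NOT EXISTS api_raw (
        fetched_at timestamptz NOT NULL DEFAULT now(),
        payload    jsonb       NOT NULL
    );

    -- Fetch the JSON from the API (placeholder URL) and store it as jsonb
    INSERT INTO api_raw (payload)
    SELECT content::jsonb
    FROM http_get('https://example.com/api/data');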

You could use some JSON functions to parse the information out of the JSON payload, but they can't perform any complex logic by themselves, so you might end up with a pretty large query when trying to parse large amounts of data.

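For illustration only (the payload shape and the target table items(id, name, price) are made up, and the upsert assumes a unique constraint on id), the built-in JSON functions can flatten an array of objects into rows:

    -- Assumes each payload is a JSON array of objects such as
    --   [{"id": 1, "name": "foo", "price": 9.99}, ...]
    INSERT INTO items (id, name, price)
    SELECT (elem->>'id')::int,
           elem->>'name',
           (elem->>'price')::numeric
    FROM (SELECT payload FROM api_raw ORDER BY fetched_at DESC LIMIT 1) latest,
         jsonb_array_elements(latest.payload) AS elem
    ON CONFLICT (id) DO UPDATE
        SET name  = EXCLUDED.name,
            price = EXCLUDED.price;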

Finally, to schedule the query on a daily basis, you could use pgAgent. This also requires an additional installation, since it isn't built into Postgres.

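Since pgAgent jobs are normally set up through pgAdmin's Jobs node rather than plain SQL, one convenient pattern is to wrap the whole fetch-and-update in a single function and have the daily job step call it. A minimal sketch, reusing the hypothetical names from above:

    -- Hypothetical wrapper the daily pgAgent job step would call
    CREATE OR REPLACE FUNCTION refresh_from_api()
    RETURNS void
    LANGUAGE plpgsql
    AS $$
    BEGIN
        INSERT INTO api_raw (payload)
        SELECT content::jsonb
        FROM http_get('https://example.com/api/data');

        -- ...then parse api_raw into the real tables, as sketched above...
    END;
    $$;

    -- The scheduled job step would then simply run:
    -- SELECT refresh_from_api();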

I say it might not be a good idea because you should probably keep a separation between your data store and your application logic; this isn't the kind of thing Postgres was designed to do. It would be much easier to capture and manipulate the data using a simple script or application that connects both to the Postgres database and to the website supplying the data. There are languages much better suited for handling this sort of situation, and the result would be far more legible and easier to maintain than a massive SQL query.


If you provide some more information about the specific scenario for why you're doing this, I could try to give you a more specific and insightful answer.

