I have 6000 work items stored in TFS
- Firstly, I need to retrieve them all (this is fairly simple to do using WIQL or something similar).
- Then, I need to filter out all work items I was not working on during a particular date range. For example, I check whether any changes were made to each work item within that range.
- Finally, I show the work items on a web page (the fastest step).
However, the whole process takes about 300 seconds to complete. I assume this is because I need to analyze the history of each work item. So, are there any tricks that could improve this time?
More details: I have a web application that needs to do all of this. I am using .NET, I am using the work item store cache (but it does not seem to help much), and I am free to use any tool to speed up the process.
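For reference, the slow path described above corresponds roughly to the sketch below, using the classic TFS client object model (Microsoft.TeamFoundation.WorkItemTracking.Client). The collection URL and date range are placeholders; the per-item walk over Revisions is where most of the ~300 seconds goes.

```csharp
using System;
using System.Linq;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

class NaiveHistoryScan
{
    static void Main()
    {
        // Placeholder collection URL and date range.
        var collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
            new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
        var store = collection.GetService<WorkItemStore>();

        var from = new DateTime(2013, 1, 1);
        var to   = new DateTime(2013, 1, 31);

        // Let WIQL do a first server-side cut on Changed Date instead of
        // pulling all 6000 items unconditionally.
        string wiql =
            "SELECT [System.Id], [System.Title] FROM WorkItems " +
            "WHERE [System.ChangedDate] >= '" + from.ToString("yyyy-MM-dd") + "'";

        WorkItemCollection items = store.Query(wiql);

        // Walking Revisions is the expensive part: each item's history is
        // materialized separately, which dominates the total runtime.
        var touched = items.Cast<WorkItem>()
            .Where(wi => wi.Revisions.Cast<Revision>()
                .Any(r =>
                {
                    var changed = (DateTime)r.Fields["Changed Date"].Value;
                    return changed >= from && changed <= to;
                }))
            .ToList();

        Console.WriteLine("{0} work items changed in the range", touched.Count);
    }
}
```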
2 Answers
#1
The absolute fastest way: an SSIS package (or any ETL process) that performs the data transformations and stores the denormalized data you need for this application somewhere. Depending on your scenario, this package could run nightly, hourly, or at whatever frequency (within sane limits) you need. However, if you need real-time views of the data (and everybody thinks they do, but rarely actually does), this won't work. I'd look into caching the data and only grabbing and filtering items that have changed.
It seems to me that the real bottleneck in this process is that you're grabbing all of the work items and then doing the filtering.
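As a rough illustration of the "only grab what changed" idea, here is a minimal sketch under the same client object model assumptions as above. How the cache and the last sync timestamp are persisted is left abstract; in practice that would be the denormalized store the SSIS/ETL job maintains.

```csharp
using System;
using System.Collections.Generic;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

static class IncrementalRefresh
{
    public static void Refresh(WorkItemStore store,
                               IDictionary<int, WorkItem> cache,
                               ref DateTime lastSync)
    {
        // WIQL date comparisons are day-precision by default, so anything
        // changed on or after the last sync day is re-pulled; the keyed
        // merge below makes the overlap harmless.
        string wiql =
            "SELECT [System.Id] FROM WorkItems " +
            "WHERE [System.ChangedDate] >= '" + lastSync.ToString("yyyy-MM-dd") + "'";

        foreach (WorkItem wi in store.Query(wiql))
            cache[wi.Id] = wi;                 // refresh only the items that moved

        lastSync = DateTime.UtcNow;
    }
}
```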
#2
The fastest way would be to query the TfsWorkItemTracking database on the server directly via SQL. This is not recommended, of course, but I have done it for a similar web application and it works quite well and reasonably fast. The DB structure is not too complicated.
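A sketch of what that direct-SQL route might look like from .NET. The table and column names below (dbo.WorkItemsWere, [ID], [Changed Date]) are placeholders: the TfsWorkItemTracking schema varies between TFS versions and reading it directly is unsupported, so verify the names against your own instance first.

```csharp
using System;
using System.Data.SqlClient;

class DirectSqlScan
{
    static void Main()
    {
        // Placeholder connection string; point it at the collection's
        // work item tracking database on your TFS data tier.
        var connectionString =
            "Data Source=tfs-sql;Initial Catalog=TfsWorkItemTracking;Integrated Security=true";

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Placeholder table/column names: the point is to let SQL return only
            // the IDs of items revised inside the date range, instead of walking
            // each item's history through the object model.
            var cmd = new SqlCommand(
                "SELECT DISTINCT [ID] FROM dbo.WorkItemsWere " +
                "WHERE [Changed Date] BETWEEN @from AND @to", conn);
            cmd.Parameters.AddWithValue("@from", new DateTime(2013, 1, 1));
            cmd.Parameters.AddWithValue("@to",   new DateTime(2013, 1, 31));

            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    Console.WriteLine(reader.GetInt32(0));   // IDs changed in range
        }
    }
}
```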