
OutputCache: routes from Controllers that are set not to be cached come out blank


I came across this after spending a few hours trying to debug why a page that worked fine in 1.6 was now coming out blank.

The root of the problem is that the CapturingResponseFilter no longer writes to the sink that was passed into it, yet the capture filter is left attached to the Response. As a result, if OnResultExecuted returns at any point before the captured output is written back out, in my case at:
        if (configuration != null && configuration.Duration == 0) {
the capture filter remains attached to the response, and anything subsequently written to it goes nowhere and is lost.
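The failure mode can be seen with a minimal sketch. This is not Orchard's actual CapturingResponseFilter, just the same stream-wrapping pattern: the filter holds the original response stream as its sink but diverts every write into a capture buffer, so if the buffer is discarded without restoring the sink, the real output stays empty.

```csharp
using System;
using System.IO;
using System.Text;

// Minimal sketch (hypothetical class, not Orchard's): a capturing filter wraps
// the real response stream ("sink") and diverts all writes into a MemoryStream.
public class CapturingFilter : Stream {
    private readonly MemoryStream _capture = new MemoryStream();
    public Stream Sink { get; }                      // the original response filter
    public CapturingFilter(Stream sink) { Sink = sink; }

    public override void Write(byte[] buffer, int offset, int count) {
        _capture.Write(buffer, offset, count);       // captured, NOT forwarded to Sink
    }
    public string GetCapturedText() {
        return Encoding.UTF8.GetString(_capture.ToArray());
    }

    // Remaining Stream members, stubbed for the sketch
    public override bool CanRead => false;
    public override bool CanSeek => false;
    public override bool CanWrite => true;
    public override long Length => _capture.Length;
    public override long Position { get { return _capture.Position; } set { } }
    public override void Flush() { }
    public override int Read(byte[] b, int o, int c) { throw new NotSupportedException(); }
    public override long Seek(long o, SeekOrigin s) { throw new NotSupportedException(); }
    public override void SetLength(long v) { }
}

public static class Demo {
    public static void Main() {
        var realOutput = new MemoryStream();
        Stream filter = new CapturingFilter(realOutput);

        // While the capture filter is attached, writes land in the capture
        // buffer, not the real output. If the buffer is then thrown away
        // (e.g. Duration == 0) without restoring the sink, the page is blank.
        var bytes = Encoding.UTF8.GetBytes("<html>page body</html>");
        filter.Write(bytes, 0, bytes.Length);

        Console.WriteLine(realOutput.Length);
    }
}
```

Running the demo prints 0: nothing ever reached the real output stream, which is exactly the blank page described above.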

What works for me is to restore the original filter to the response, using the sink that was passed into the capture filter, like so:
        if (configuration != null && configuration.Duration == 0) {
            response.Filter = _filter.Sink; // restore the original response filter before returning
        }
Attached is my patched version (also including the patch from issue 20050)


Closed Oct 25, 2013 at 9:04 PM by sebastienros


rdobson wrote Sep 4, 2013 at 2:22 PM

Missed adding the extra patch to OnActionExecuted:
        if (!(filterContext.Result is ViewResultBase) && !AuthorizedContentTypes.Contains(filterContext.HttpContext.Response.ContentType)) {
            if (_filter != null) {
                filterContext.HttpContext.Response.Filter = _filter.Sink;
            }
            _filter = null;
        }

2LM wrote Sep 5, 2013 at 10:47 PM

I have tested this and it seems to fix all issues I was having with Advanced Sitemap, custom 404 pages and robots.txt

2LM wrote Sep 5, 2013 at 10:49 PM

Although it doesn't seem to be as fast as without this fix...

rdobson wrote Sep 6, 2013 at 4:19 PM

404 pages are not cached anyway (only 200 response codes are), and neither is robots.txt, which relates to the 20050 patch: that patch ensures that only the "text/html", "text/xml" and "text/json" content types are cached, and robots.txt is served as "text/plain". If you want robots.txt to be cached, just add text/plain to the AuthorizedContentTypes array.
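Assuming the whitelist from the 20050 patch is a plain string array (the exact field name and declaration in the patch may differ), the change would look like:

```csharp
// Hypothetical declaration: the 20050 patch's whitelist of cacheable content
// types, extended with "text/plain" so robots.txt responses can be cached too.
private static readonly string[] AuthorizedContentTypes = {
    "text/html",
    "text/xml",
    "text/json",
    "text/plain"
};
```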

2LM wrote Sep 6, 2013 at 4:50 PM

Yes, I know, but without your fix my 404 page, robots.txt and sitemap.xml were simply empty

rdobson wrote Sep 6, 2013 at 6:52 PM

Yes, they will be. I was talking about the slowness you mentioned: for those pages you will not see any speed benefit from caching, as they would never be cached. This just fixes the bug (as detailed) where the response output is captured into a memory stream that is never written out in particular circumstances.

2LM wrote Sep 6, 2013 at 7:26 PM

Ah ok, seems we are misunderstanding each other then. The difference in speed I was referring to was caching in general, with or without the fix.

rdobson wrote Sep 25, 2013 at 3:04 PM

I'm not seeing any difference in speed myself, with or without the fix, and that is on a live environment with ~50 tenants currently running.

The only difference I am seeing is that caching in 1.7 is far faster than 1.6.

sebastienros wrote Oct 25, 2013 at 9:04 PM

Can't repro as of today on 1.7.x, and I worked on the module in the meantime. Can you try it yourself?