EtchUK/Etch.OrchardCore.SEO

Robots.txt - "Disallow all pages" mode - workflow causes problems with HTTP requests.


Description:

When a workflow has an HTTP endpoint and the Robots.txt "disallow all pages" mode is enabled, the workflow instance does not work properly. Based on the error message, I believe this is caused by NoIndexFilter.cs attempting to set a response header after the response has already started.

Error log:

System.InvalidOperationException: Headers are read-only, response has already started.
at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.HttpHeaders.ThrowHeadersReadOnlyException()
at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.HttpHeaders.Microsoft.AspNetCore.Http.IHeaderDictionary.set_Item(String key, StringValues value)
at Etch.OrchardCore.SEO.RobotsTxt.Filters.NoIndexFilter.OnResultExecutionAsync(ResultExecutingContext context, ResultExecutionDelegate next)
at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.g__Awaited|30_0[TFilter,TFilterAsync](ResourceInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.Rethrow(ResultExecutedContextSealed context)
at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.ResultNext[TFilter,TFilterAsync](State& next, Scope& scope, Object& state, Boolean& isCompleted)
at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.InvokeResultFilters()
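A minimal sketch of a possible defensive fix, assuming the filter sets an `X-Robots-Tag` header (the exact header and member names in NoIndexFilter.cs may differ; this is not the actual source):

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc.Filters;

// Hypothetical sketch of NoIndexFilter, for illustration only.
// Assumes the filter adds an "X-Robots-Tag: noindex" header when
// "disallow all pages" mode is active.
public class NoIndexFilter : IAsyncResultFilter
{
    public async Task OnResultExecutionAsync(
        ResultExecutingContext context, ResultExecutionDelegate next)
    {
        var response = context.HttpContext.Response;

        // Guard: once the response body has started streaming (as it can
        // for workflow HTTP endpoints), headers are read-only and setting
        // one throws InvalidOperationException.
        if (!response.HasStarted)
        {
            response.Headers["X-Robots-Tag"] = "noindex";
        }

        await next();
    }
}
```

Checking `HttpResponse.HasStarted` before mutating headers (or registering the mutation via `HttpResponse.OnStarting`) is the standard ASP.NET Core pattern for avoiding this exception.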

If I can spare the time, I plan to open a PR.