toddams/RazorLight

Feature Request: Generate an assembly and persist it to disk (Precompile views)

MarkKharitonov opened this issue · 21 comments

Hi,
My use case: I am calling a small exe tool wrapping RazorLight from PowerShell.
I have just a few templates, but a lot of model objects.
In order to keep the PowerShell code well organized, I end up calling the tool with the same templates multiple times. As a result, the performance is abysmal.

However, if I could persist the generated assembly, then there would be a way to reuse it across the different calls.

Unfortunately, I failed to find a way to do it.

What am I missing?
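
For reference, each invocation of the tool does roughly the following (a simplified sketch - the template root, key, and model below are placeholders, not the tool's real code). Because the engine and its MemoryCachingProvider live only as long as the process, every call pays the full template compilation cost:

    using System.Threading.Tasks;
    using RazorLight;

    public class TemplateModel
    {
        public string Title { get; set; }
    }

    public static class Renderer
    {
        public static async Task<string> RenderAsync(string templateRoot, string templateKey, TemplateModel model)
        {
            // A fresh engine is built on every run, so nothing compiled here
            // survives past the process - the built-in cache is in-memory only.
            var engine = new RazorLightEngineBuilder()
                .UseFileSystemProject(templateRoot)
                .UseMemoryCachingProvider()
                .Build();

            return await engine.CompileRenderAsync(templateKey, model);
        }
    }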

P.S.
I asked the same question on SO - https://stackoverflow.com/questions/73081390/is-there-an-off-the-shelf-implementation-of-the-razorlights-icachingprovider-th

Can you discuss why this is better than creating a global tool to precompile the views and then loading the precompiled views?

In order to keep the PowerShell code well organized, I end up calling the tool with the same templates multiple times. As a result, the performance is abysmal.

Just wondering if this might be an X-Y problem - what does your PowerShell code look like that the performance is so abysmal? Can you quantify it? How many calls are you making, how long do the calls take, why do you think it is slow... any additional info probably helps discover what your performance issue really is.

@jzabroski - Sure.
I need to create 55 work items from 55 JSON objects. The work item fields do not correspond 1-1 to the JSON objects, so Razor templates are used to help here. For example, a JSON object may contain annotations. Each annotation may be mapped to a work item comment. But an annotation has 4 fields - how should they be rendered? Another example - the description field - what should go there? I, as an implementer of the process, do not want to hard code it. Hence there is a Razor template for that.
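
To make the annotation example concrete, the per-annotation comment template is conceptually something like this (the field names are purely illustrative, not the actual Veracode schema):

    @* Hypothetical comment template: Model is a single annotation object. *@
    <div>
        <b>@Model.action</b> by @Model.user_name on @Model.created
        <p>@Model.comment</p>
    </div>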
So the current process employs 4 Razor templates, where the one that produces comments from annotations is called multiple times - as many times as there are annotations.
Here is the PowerShell code for a single scan result. It is called in a loop for every fetched scan result.

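    # Dump the scan result to a JSON file; it feeds the renderer below and is attached to the work item.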
    $VeracodeScanResult | ConvertTo-Json -Depth 99 -Compress | Out-File -Encoding utf8 "$TempDir\$VeracodeIssueId.json"

    $AssignedTo = $AllAssignedTo | Select-Object -First 1

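    # Build the Description: @-mention every assignee except the first, then render the description template if one exists.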
    $Description = (
        & {
            $AllAssignedTo | Select-Object -Skip 1 | ForEach-Object {
                "<div><a href=`"#`" data-vss-mention=`"version:2.0,5b9e7f0d-baf6-4b18-82d3-8333f6df35ac`">@$_</a></div>"
            }
            if (Test-Path "$SecurityMetadataHome\WorkItemDescription.cshtml")
            {
                & $InvokeNativeCommand {
                    & "$VeracodeBuildHelperHome\VeracodeBuildHelper.exe" render -j "$TempDir\$VeracodeIssueId.json" -t "$SecurityMetadataHome\WorkItemDescription.cshtml"
                }
            }
        }
    ) -join "`r`n"

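    # Render the comment templates; the annotations template is invoked once per annotation, iterating in reverse order.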
    $Comments = Get-Item "$SecurityMetadataHome\*.WorkItemComment*.cshtml" | ForEach-Object {
        $TemplateFilePath = $_.FullName
        if ($TemplateFilePath.EndsWith(".WorkItemComment-FromAnnotations.cshtml", 'OrdinalIgnoreCase'))
        {
            if ($VeracodeScanResult.annotations)
            {
                for ($i = $VeracodeScanResult.annotations.Count - 1; $i -ge 0; --$i)
                {
                    (
                        & $InvokeNativeCommand {
                            & "$VeracodeBuildHelperHome\VeracodeBuildHelper.exe" render -j "$TempDir\$VeracodeIssueId.json" -t $TemplateFilePath -q "annotations[$i]"
                        }
                    ) -join "`r`n"
                }
            }
        }
        else
        {
            (
                & $InvokeNativeCommand {
                    & "$VeracodeBuildHelperHome\VeracodeBuildHelper.exe" render -j "$TempDir\$VeracodeIssueId.json" -t $TemplateFilePath
                }
            ) -join "`r`n"
        }
    } | ForEach-Object { $_ }

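    # Render the work item fields template and split the result into well-known fields vs. other fields.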
    $Fields = & $InvokeNativeCommand {
        & "$VeracodeBuildHelperHome\VeracodeBuildHelper.exe" render -j "$TempDir\$VeracodeIssueId.json" -t "$SecurityMetadataHome\WorkItemFields.json"
    } | ConvertFrom-Json

    $WellKnownFields = @{
        Title      = $Fields.Title
        HyperLinks = $Fields.HyperLinks
        Tags       = $Fields.Tags
    }

    $OtherFields = @{
        'dayforce.AhaId'                 = $VeracodeIssueId
        'Microsoft.VSTS.Common.Priority' = 5 - $VeracodeScanResult.finding_details.severity
    }

    $Fields.PSObject.Properties | Where-Object {
        $null -eq $WellKnownFields[$_.Name]
    } | ForEach-Object {
        $OtherFields[$_.Name] = $_.Value
    }

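    # Create the PBI work item from the rendered pieces.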
    $w = New-WorkItem PBI @WellKnownFields `
        -AreaPath $WorkItemAreaPath `
        -TeamProject $WorkItemTeamProject `
        -AssignedTo $AssignedTo `
        -Description $Description `
        -Comments $Comments `
        -Attachments "$TempDir\$VeracodeIssueId.json" `
        -OtherFields $OtherFields

So the entire logic could be done in C#, but that raises the bar for those who need to maintain this code. I want to make Razor available in scripting.

Now the timings. I am attaching two files - one from before the change and the other from after it.
CreateWorkItemsAfter.txt
CreateWorkItemsBefore.txt

Now, I recognize the PowerShell code could be structured differently - it could prepare all the input up front and call the tool once, but that complicates the tool and makes it less generic. Right now the tool has a very simple interface - "give me the template, give me the JSON, and optionally give me a JSON query, and I will render your template from the given JSON". The change I suggest would make it much faster in scripting, and it does not seem to be costly or cause any regression, does it?

(Please ignore the name of the tool - VeracodeBuildHelper. The rendering logic will be moved to its own dedicated Razor tool.)

Can you discuss why this is better than creating a global tool to precompile the views, and load the precompiled views instead?

No, it is not better. I am constrained in time and so need to improvise. That is the only reason. If I knew the amount of work needed to bring the global tool to a production-ready state, then maybe I could contribute to it and accelerate its release.

If I knew the amount of work needed to bring the global tool to a production-ready state, then maybe I could contribute to it and accelerate its release.

Don't you effectively have everything you need in your code snippet above? https://stackoverflow.com/a/73103966/1040437

It's just that rather than making it a caching provider, you call the compiler directly and cache the generated assembly at the very end.
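
In other words, something along these lines - a rough sketch of the mechanism only, using plain Roslyn rather than RazorLight's actual compiler plumbing, with illustrative names throughout:

    using System.Reflection;
    using Microsoft.CodeAnalysis;
    using Microsoft.CodeAnalysis.CSharp;

    public static class TemplatePersistence
    {
        // Sketch: once the generated C# source for a template is in hand, emit it
        // to a .dll on disk so later runs can load it instead of recompiling.
        // A real template also needs references to RazorLight and the model assembly.
        public static void PersistCompiledTemplate(string generatedCSharp, string assemblyPath)
        {
            var compilation = CSharpCompilation.Create(
                assemblyName: "PrecompiledTemplates",
                syntaxTrees: new[] { CSharpSyntaxTree.ParseText(generatedCSharp) },
                references: new[] { MetadataReference.CreateFromFile(typeof(object).Assembly.Location) },
                options: new CSharpCompilationOptions(OutputKind.DynamicallyLinkedLibrary));

            var emitResult = compilation.Emit(assemblyPath); // writes the assembly to disk
            if (!emitResult.Success)
                throw new System.InvalidOperationException("Template compilation failed.");
        }

        // Subsequent runs load the persisted assembly and skip compilation entirely.
        public static Assembly LoadPrecompiledTemplates(string assemblyPath) => Assembly.LoadFrom(assemblyPath);
    }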

I need to think about it.

I am looking at it, but I have a question about the unit tests - some of them are failing on master. Am I correct?

@MarkKharitonov I believe this is resolved now that I merged your PR. Thanks for this incredible contribution. It makes open source worth it and is a reminder of why I still do it.

Excellent. Thank you very much.

@jzabroski - I posted a PR that fixes the unit tests. Could you have a look?

Looking

Release created; it should flow up to nuget.org in about an hour.

Release Notes: https://github.com/toddams/RazorLight/releases/tag/v2.2.0

The Publish to Nuget action is in progress (it may fail if the API key is out of date, in which case I may need Ivan's help): https://github.com/toddams/RazorLight/actions/runs/3253347419

Update - I forgot to update the pack/publish pipeline to include the new project. I'm going to bump the version number and just hide the 2.2.0 release on nuget.org when I get a chance.

Question - how does the precompile tool become available for use? Ideally, it should be a dotnet tool, right? But I have not done anything special to enable that - not that I have ever done it before.

I realized that now - you had IsPackable set to false.

It should just be:

    <ToolCommandName>razorlight-precompile</ToolCommandName>
    <PackAsTool>true</PackAsTool>
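
Once those are set (and IsPackable is no longer false), the pack produces a dotnet tool package that can be installed globally; the package id below is a placeholder - see the release notes for the real one:

    dotnet tool install --global <package-id>

After that the razorlight-precompile command is available on the PATH.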

I reached out to Ivan for help. The issue is our existing API key is tied to a finite list of packages rather than a generic glob pattern, so adding new packages won't work.

Is there a usage example of this somewhere yet? Thanks!

There are unit tests. For example, have a look at this precompile unit test -

    public void PrecompileAndRender(string templateFilePath, string jsonQuery, string expected)
    {
        templateFilePath = "Samples/" + templateFilePath;
        var commandLineArgs = new List<string>
        {
            "precompile",
            "-t",
            templateFilePath,
            "-m",
            "Samples/FindingsWithSourceCodeInfo.json"
        };
        if (jsonQuery != null)
        {
            commandLineArgs.AddRange(new[] { "-q", jsonQuery });
        }
        var actual = Helper.RunCommand(commandLineArgs.ToArray()).ToString();
        Assert.AreEqual(expected, actual);
    }
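
Stripped of the test harness, that corresponds to invoking the tool roughly like this (the template path and JSON query are placeholders standing in for the test's parameters, and -q is optional):

    razorlight-precompile precompile -t Samples/SomeTemplate.cshtml -m Samples/FindingsWithSourceCodeInfo.json -q "annotations[0]"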
Does it make sense?

I reached out to Ivan for help. The issue is our existing API key is tied to a finite list of packages rather than a generic glob pattern, so adding new packages won't work.

@jzabroski - any update?

2.3.0 has been up on nuget.org for the last week. Sorry for not replying. nuget.org has also reserved the RazorLight prefix for the RazorLightTeam organization going forward. Ideally you should have permission to be added to this team too if you'd like, given the outsized contribution you've made. Up to Ivan though.

That is OK, I am fine with just being an outside contributor. Great news, thank you very much.