sakeproject/sake

Unable to build under mono

Closed this issue · 12 comments

When building under Mono 2.11 on Ubuntu 12.04, the build process fails:

warn: Dynamic view compilation failed.
(0,0): error CS1703: An assembly with the same identity `mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' has already been imported. Consider removing one of the references
(0,0): error : mscorlib.dll (Location of the symbol related to previous error)
(0,0): error : /opt/mono-2.11/lib/mono/4.5/mscorlib.dll (Location of the symbol related to previous error)
(0,0): error CS1703: An assembly with the same identity `System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' has already been imported. Consider removing one of the references
(0,0): error : /opt/mono-2.11/lib/mono/gac/System/4.0.0.0__b77a5c561934e089/System.dll (Location of the symbol related to previous error)
(0,0): error : System.dll (Location of the symbol related to previous error)
(0,0): error CS1703: An assembly with the same identity `System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' has already been imported. Consider removing one of the references
(0,0): error : /opt/mono-2.11/lib/mono/gac/System.Xml/4.0.0.0__b77a5c561934e089/System.Xml.dll (Location of the symbol related to previous error)
(0,0): error : System.Xml.dll (Location of the symbol related to previous error)
(0,0): error CS1703: An assembly with the same identity `System.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' has already been imported. Consider removing one of the references
(0,0): error : /opt/mono-2.11/lib/mono/gac/System.Core/4.0.0.0__b77a5c561934e089/System.Core.dll (Location of the symbol related to previous error)
(0,0): error : System.Core.dll (Location of the symbol related to previous error)

Stack trace:

verbose: Stack trace: 
  at Spark.Compiler.BatchCompiler.Compile (Boolean debug, System.String languageOrExtension, System.String[] sourceCode) [0x00000] in <filename unknown>:0 
  at Spark.Compiler.CSharp.CSharpViewCompiler.CompileView (IEnumerable`1 viewTemplates, IEnumerable`1 allResources) [0x00000] in <filename unknown>:0 
  at Spark.SparkViewEngine.CreateEntryInternal (Spark.SparkViewDescriptor descriptor, Boolean compile) [0x00000] in <filename unknown>:0 
  at Spark.SparkViewEngine.CreateEntry (Spark.SparkViewDescriptor descriptor) [0x00000] in <filename unknown>:0 
  at Spark.SparkViewEngine.CreateInstance (Spark.SparkViewDescriptor descriptor) [0x00000] in <filename unknown>:0 
  at Sake.Engine.Loader.DefaultLoader.Load (Sake.Engine.Options options) [0x00000] in <filename unknown>:0 
  at Sake.Engine.SakeEngine.Execute (Sake.Engine.Options options) [0x00000] in <filename unknown>:0 
  at Sake.Engine.SakeEngine.Execute (System.String[] args) [0x00000] in <filename unknown>:0 
  at Sake.Program.Main (System.String[] args) [0x00000] in <filename unknown>:0 

I get a similar result when calling "sh ./build" on my Windows machine. The "build.cmd" works fine, though. When building Katana it fails at a similar point.

I'm running into the same issue with Mono 3.0.1 on a Mac. Based on the stack trace, this may be a Spark issue, since Sake uses the Shade support that was added into Spark. Maybe @RobertTheGrey can chime in as well.

I doubt there's much I can chime in on that @loudej wouldn't already have thought of. On the surface it looks like it could be a Spark issue, but I doubt it. There are a fair few projects I know of now running Spark on Linux, and if there were an issue breaking dynamic compilation in general on Mono, we would have heard about it.

This seems more like the kind of problem we see when there's a mismatch between the version of .NET targeted by the C# compiler Spark is (perhaps incorrectly) using and the version the project itself is using. For example, we had this same issue when Spark was hard-coded to compile with v3.5 and someone was trying to use it in a .NET 4.0 project, back when 4.0 was new.

Is there 4.5 somewhere in the equation here? If so, it could be that we're back to that same issue again, but I can't be sure without more information. Unfortunately I don't have much time right now to set up a Linux VM to test this on, but hopefully it gives a clue where to look - perhaps debug from the Spark source and see if it's the C# compiler that's throwing...

After posting, I dug into this a little further and I think @RobertTheGrey is right. Mono 2.11.3 and Mono 3.0.* ship with the 4.5 runtime. Thanks for chiming in @RobertTheGrey; it helps us narrow down where we should start looking.

In that case, it's fairly likely that's the problem on the Spark side. It should be an easy fix in the Spark source, in the compiler, where a pretty horrible if statement decides which version to use - something I've been meaning to look at for ages and haven't made time for.

If you expand that if statement to include consideration for v4.5, I can take the pull request as an interim fix once you've tested that it works for you. After that, I'll hopefully find some time to fix this more permanently.

Are you talking about the BatchCompiler?

I think that's the one yes ... if memory serves.

Rob
Sent from my iPhone

On Sun, Nov 18, 2012 at 5:04 PM, Dale Ragan <notifications@github.com> wrote:

Are you talking about the BatchCompiler (https://github.com/SparkViewEngine/spark/blob/master/src/Spark/Compiler/BatchCompiler.cs)?

I'm not sure what it is that I need to change. I don't really see an if statement determining the version. I see a call that gets the compiler version, which just returns "v4.0". From there it's just added into the ProviderOptions and passed into the CSharpCodeProvider.
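For anyone following along, that flow is essentially the standard CodeDOM pattern. Here's a minimal sketch (illustrative only, not the actual Spark source; the class and method names are made up) of how a compiler-version string ends up in the provider options passed to the CSharpCodeProvider:

```csharp
// Minimal sketch, not the actual Spark source: the version string returned
// by the compiler-version lookup is passed through the provider options
// when the CSharpCodeProvider is constructed.
using System.Collections.Generic;
using Microsoft.CSharp;

class CompilerVersionSketch
{
    static CSharpCodeProvider CreateProvider(string compilerVersion)
    {
        var providerOptions = new Dictionary<string, string>
        {
            { "CompilerVersion", compilerVersion } // e.g. "v4.0"
        };
        return new CSharpCodeProvider(providerOptions);
    }
}
```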

Also, it's worth noting that when we're running Sake under mono, we're specifying to use the 4.0 runtime.

Ah OK, it looks like the "if statement" I was referring to was indeed fixed by a pull request a couple of years ago - for some reason I still had it on my mental TODO list. If you were debugging far enough to see "v4.0" come back from that call, are you able to debug further to the point where it actually compiles and throws?

That's pretty much what I would have to do now - but it definitely still looks as if a stray reference to another version of mscorlib (perhaps because the GAC is globally available?) is finding its way in there somehow.

It may be worth trying to run the build under v4.5 and seeing whether GetCompilerVersion() in BatchCompiler correctly reports 4.5 as well. It'd also be worth seeing whether that works at all - there's no reason I can think of off the top of my head why it shouldn't all compile just fine under 4.5.

Sorry if this is going in circles, but without an environment, I can't really test any of this myself.

Okay, I debugged it, and it looks like mscorlib, System, System.Xml, and System.Core are already loaded into the domain when compiling. I put in a simple if block to exclude them when looping over the current AppDomain's loaded assemblies. I'm not sure whether this is the expected behavior of Mono, but Sake can now run under Mono 3.0 with the above change to Spark. I'll need to investigate with the Mono team to see whether this is the expected behavior and whether this bug is related to this one.
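For reference, here's a minimal sketch of that kind of workaround, assuming the reference list is built by walking AppDomain.CurrentDomain.GetAssemblies() and adding each assembly location to the CompilerParameters (the class and method names are illustrative, not the exact Spark code):

```csharp
// Minimal sketch of the workaround described above, assuming the compiler
// references come from the current AppDomain; names are illustrative.
using System;
using System.CodeDom.Compiler;

class ReferenceFilterSketch
{
    // Framework assemblies Mono already imports implicitly; referencing them
    // again is what triggers the CS1703 "same identity" errors shown above.
    static readonly string[] Excluded = { "mscorlib", "System", "System.Xml", "System.Core" };

    static CompilerParameters BuildParameters()
    {
        var parameters = new CompilerParameters { GenerateInMemory = true };
        foreach (var assembly in AppDomain.CurrentDomain.GetAssemblies())
        {
            // Dynamic assemblies have no location on disk to reference.
            if (assembly.IsDynamic)
                continue;

            // Skip the framework assemblies that would otherwise be duplicated.
            if (Array.IndexOf(Excluded, assembly.GetName().Name) >= 0)
                continue;

            if (!string.IsNullOrEmpty(assembly.Location))
                parameters.ReferencedAssemblies.Add(assembly.Location);
        }
        return parameters;
    }
}
```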

Well, I just updated my local copy of Mono to master and, sure enough, Sake and Spark run under it without any changes. Looks like this bug was related.

Ah good news! :)