Question
I have a common ASP.NET web application that is shared by multiple sites. I used NuGet to package this common web application and distribute it across the sites, following the idea from: Multiple ASP.NET web applications depending on a common base application.
By doing this, I ran into some issues. When updating to a new version, NuGet becomes really slow, because of the 800 content files the package contains. NuGet somehow needs around 1-2 seconds per content file it has to uninstall, which adds up to approximately 25 minutes for the uninstall and 5 minutes for the install, especially with a TFS binding. Looking at the NuGet source code, I figured out that the Visual Studio API NuGet talks to is the bottleneck, causing Visual Studio to peak at around 100% CPU usage for the whole process.
So I thought: if Visual Studio is that slow, maybe I can do it without Visual Studio. To my disappointment, the NuGet command line (which runs without Visual Studio) only downloads a package and unzips it. It won't update the project file, because the package's PowerShell scripts could reference the DTE object... (even though mine don't).
Now I'm wondering: what are my options?
- Doing some XML magic on the project file to add the content items (see the sketch after this list)? What are the drawbacks of doing this? Is there already a tool for it?
- Doing some magic with build scripts?
- Throwing the content files out of the package and using another tool like Bower for them? How could that be integrated into the project? Because in the end I want to see the content files in my project.
- Not using NuGet at all but something else...? OpenWrap? Horn (which seems to no longer be active)? Or maybe no package manager at all?
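To make the first option concrete, here's a rough sketch of the kind of tool I have in mind (hypothetical and untested). An old-style .csproj is plain MSBuild XML, so adding content items boils down to appending Content elements to an ItemGroup:

using System;
using System.Linq;
using System.Xml.Linq;

// Hypothetical sketch: append <Content Include="..."/> items to a .csproj.
// Usage: AddContent MySite.csproj Scripts\common.js Content\common.css
class AddContent
{
    // Old-style MSBuild project files use this XML namespace.
    static readonly XNamespace Ns = "http://schemas.microsoft.com/developer/msbuild/2003";

    static void Main(string[] args)
    {
        var doc = XDocument.Load(args[0]);

        // Put all new files into one dedicated ItemGroup at the end of the project.
        var group = new XElement(Ns + "ItemGroup");
        foreach (var file in args.Skip(1))
        {
            group.Add(new XElement(Ns + "Content", new XAttribute("Include", file)));
        }

        doc.Root.Add(group);
        doc.Save(args[0]);
    }
}

The obvious drawbacks I can see: Visual Studio has to reload the project afterwards, and with a TFS binding the project file still needs to be checked out first.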
Please help me find the best solution :)
One other thing that frustrates me: when performing a NuGet update, it does a full uninstall followed by an install. Why, when the changes between versions may be minimal?
Answer 1:
One option: instead of adding 800 content files, embed them as resources in your shared library. Then use something like EmbeddedResourceVirtualPathProvider (https://github.com/mcintyre321/EmbeddedResourceVirtualPathProvider) to serve them up. That way NuGet simply replaces the .dll instead of 800 individual files.
There are plenty of articles online about how to use VirtualPathProvider.
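If you haven't worked with VirtualPathProvider before, the general pattern looks roughly like this. This is a deliberately minimal sketch, not the linked library's actual implementation, and the resource-naming convention (TestResourceLibrary.Scripts.site.js and so on) is an assumption:

using System;
using System.IO;
using System.Reflection;
using System.Web;
using System.Web.Hosting;

// Minimal sketch: serve "~/Scripts/site.js" from an embedded resource named
// "TestResourceLibrary.Scripts.site.js", falling back to files on disk.
public class EmbeddedFileProvider : VirtualPathProvider
{
    private readonly Assembly assembly;

    public EmbeddedFileProvider(Assembly assembly)
    {
        this.assembly = assembly;
    }

    // "~/Scripts/site.js" -> "TestResourceLibrary.Scripts.site.js" (assumed convention).
    private string ToResourceName(string virtualPath)
    {
        string relative = VirtualPathUtility.ToAppRelative(virtualPath).TrimStart('~', '/');
        return "TestResourceLibrary." + relative.Replace('/', '.');
    }

    private bool ResourceExists(string virtualPath)
    {
        return assembly.GetManifestResourceInfo(ToResourceName(virtualPath)) != null;
    }

    public override bool FileExists(string virtualPath)
    {
        // Fall through to the previous provider, which serves files on disk.
        return ResourceExists(virtualPath) || Previous.FileExists(virtualPath);
    }

    public override VirtualFile GetFile(string virtualPath)
    {
        if (!ResourceExists(virtualPath))
            return Previous.GetFile(virtualPath);

        string resourceName = ToResourceName(virtualPath);
        return new EmbeddedFile(virtualPath, () => assembly.GetManifestResourceStream(resourceName));
    }

    private class EmbeddedFile : VirtualFile
    {
        private readonly Func<Stream> open;

        public EmbeddedFile(string virtualPath, Func<Stream> open)
            : base(virtualPath)
        {
            this.open = open;
        }

        public override Stream Open()
        {
            return open();
        }
    }
}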
EDIT: Questions about performance
The solution seems relatively simple, so I would just try it and see if the performance is acceptable. The nice thing is that you don't have to do anything special with paths in your web app.
Here are the steps you need to take:
- In your resource library, set the content files to embedded resources. You can do that quickly by opening the .csproj file in a text editor and replacing <Content with <EmbeddedResource.
- Add a reference to your resource library to your web app.
- Add the EmbeddedResourceVirtualPathProvider NuGet package to your web app.
- Register the virtual path provider. ASP.NET calls any public static AppInitialize() method it finds in the App_Code folder at application startup, so here's an example that registers the provider from there:
using System.Web.Hosting;

namespace TestWebProject.App_Code
{
    public class RegisterVirtualPathProvider
    {
        public static void AppInitialize()
        {
            // Marker is any type defined in the resource library; it is used only
            // to locate that library's assembly. The relative path points at the
            // resource project on disk, so the provider can serve the files
            // straight from source while developing.
            HostingEnvironment.RegisterVirtualPathProvider(new EmbeddedResourceVirtualPathProvider.Vpp()
            {
                {typeof (Marker).Assembly, @"..\TestResourceLibrary"},
            });
        }
    }
}
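If you'd rather not rely on the App_Code AppInitialize() convention, registering from Application_Start in Global.asax works too (a sketch with the same Marker assumption as above):

using System.Web;
using System.Web.Hosting;

namespace TestWebProject
{
    public class Global : HttpApplication
    {
        protected void Application_Start()
        {
            // Same registration as above, just from Global.asax instead.
            HostingEnvironment.RegisterVirtualPathProvider(new EmbeddedResourceVirtualPathProvider.Vpp()
            {
                {typeof (Marker).Assembly, @"..\TestResourceLibrary"},
            });
        }
    }
}

Either way, note that virtual path providers registered at runtime are not used by precompiled sites, so check this against your deployment model.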
Source: https://stackoverflow.com/questions/23345438/content-files-with-nuget-are-very-very-slow