logging on Azure

Backend | open | 3 answers | 937 views
感动是毒 asked on 2020-12-28 21:11

How can we log on Azure with granularity and control equivalent to log4net? We use log4net in the web apps we run on IIS, and that works very well for us. Is that the

3 Answers
  • 2020-12-28 21:36

    The best way to log data for Azure VMs or Cloud Services is to use log4net to log to disk and, in addition, to ship log information from all your instances to an Azure storage account. The advantage is a more robust solution: if Azure Diagnostics breaks for some reason during a live-site incident, you can still remote into any of the instances and try to diagnose the issue. For other services such as Web Apps, where you can't remote into the instances, logging to an Azure storage account is sufficient.

    1. Logging to disk: If you have a cloud service, your web role/worker role will only have permission to write to the application disk, which is quite small, as shown here and here. If you are happy logging no more than, say, 200-300 MB, then you are fine. However, if you would like to log more, it is best to use the LocalStorage concept, which lets you reserve space on the C drive, which is much larger (for small PaaS VMs, drive C has roughly 225 GB, while the application drive (E:/F:) has 1.5 GB). You can learn how to use local storage here.
    2. Logging to an Azure storage account: You will need to activate Azure Diagnostics as shown here. Furthermore, you will have to add a trace appender to log4net as shown here.
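
    Step 2 above can be sketched as follows: a log4net TraceAppender writes events to System.Diagnostics.Trace, and the Azure Diagnostics trace listener registered in web.config forwards them to the storage account. A minimal sketch; the appender name, listener name, and layout pattern are illustrative, not a fixed convention:

    ```xml
    <!-- log4net config: route log events to System.Diagnostics.Trace -->
    <log4net>
      <root>
        <level value="INFO" />
        <appender-ref ref="TraceAppender" />
      </root>
      <appender name="TraceAppender" type="log4net.Appender.TraceAppender">
        <layout type="log4net.Layout.PatternLayout">
          <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
        </layout>
      </appender>
    </log4net>

    <!-- web.config: forward Trace output to Azure Diagnostics -->
    <system.diagnostics>
      <trace>
        <listeners>
          <add name="AzureDiagnostics"
               type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics" />
        </listeners>
      </trace>
    </system.diagnostics>
    ```

    With Azure Diagnostics enabled, the trace output then lands in the diagnostics storage account (the WADLogsTable table in the classic setup).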

    It will take you half a day to a day to set up and test, but I hope that answers your question.

  • 2020-12-28 21:42

    I think you should be careful when storing the log file locally in Azure, as it is not guaranteed to stick around. The VM hosting the website can be reimaged, and the logs will be lost.

    A better solution is to use Azure Diagnostics combined with log4net (the same approach works for other logging frameworks such as NLog). The process is summarized here:

    1. Set up local storage as a place on the role instance (virtual machine) where log files are written.

    2. Add a <Directories> element to the diagnostics.wadcfg file to instruct Azure Diagnostics to create and use a container in blob storage.

    3. Add a <DirectoryConfiguration> element within <Directories> to instruct Azure Diagnostics to monitor the logging folder within the LogStorage local resource location.

    This way the locally stored logs will be copied to the blob storage.
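
    The three steps above might look roughly like the following. This is a sketch: the LogStorage name matches the steps above, while the container name (wad-log4net), quota sizes, and transfer period are illustrative assumptions:

    ```xml
    <!-- ServiceDefinition.csdef: step 1, reserve local storage for log files -->
    <WebRole name="MyWebRole">
      <LocalResources>
        <LocalStorage name="LogStorage" sizeInMB="1024" cleanOnRoleRecycle="false" />
      </LocalResources>
    </WebRole>

    <!-- diagnostics.wadcfg: steps 2 and 3, transfer the LogStorage folder to blob storage -->
    <DiagnosticMonitorConfiguration
        xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration"
        configurationChangePollInterval="PT1M"
        overallQuotaInMB="4096">
      <Directories bufferQuotaInMB="1024" scheduledTransferPeriod="PT1M">
        <DataSources>
          <!-- Copies the "logs" folder inside the LogStorage local resource
               to the "wad-log4net" blob container every minute -->
          <DirectoryConfiguration container="wad-log4net" directoryQuotaInMB="128">
            <LocalResource name="LogStorage" relativePath="logs" />
          </DirectoryConfiguration>
        </DataSources>
      </Directories>
    </DiagnosticMonitorConfiguration>
    ```

    Point your log4net file appender at the LogStorage local resource path (obtained at runtime via RoleEnvironment.GetLocalResource), and the diagnostics agent does the copying.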

    Full story here: http://justazure.com/microsoft-azure-diagnostics-part-1-introduction/

  • 2020-12-28 21:55

    This is pretty straightforward. I use the following log4net configuration to dump a log file in the web application root folder (easily changed to a sub-folder):

    <log4net>
      <root>
        <level value="DEBUG" />
        <appender-ref ref="LogFileAppender" />
      </root>
      <appender name="LogFileAppender" type="log4net.Appender.RollingFileAppender" >
        <param name="File" value="my_web.log" />
        <param name="AppendToFile" value="true" />
        <rollingStyle value="Size" />
        <maxSizeRollBackups value="10" />
        <maximumFileSize value="10MB" />
        <staticLogFileName value="true" />
        <layout type="log4net.Layout.PatternLayout">
          <param name="ConversionPattern" value="%date{yyyy-dd-MM HH:mm:ss.fff} [%thread] %-5level %logger.%method [%property{NDC}] - %message%newline" />
        </layout>
      </appender>
    </log4net>
    

    I then inspect the log file when needed directly from Visual Studio's Server Explorer (double-clicking the file downloads it):

    (Screenshot: Azure Website via Server Explorer)
