How are people unit testing with Entity Framework 6, should you bother?


I am just starting out with unit testing and TDD in general. I have dabbled before but now I am determined to add it to my workflow and write better software.

I ask

11 Answers
  • 2020-11-28 17:42

    This is a topic I'm very interested in. There are many purists who say that you shouldn't test technologies such as EF and NHibernate. They are right: they're already very stringently tested, and, as a previous answer stated, it's often pointless to spend vast amounts of time testing what you don't own.

    However, you do own the database underneath! This is where, in my opinion, that approach breaks down: you don't need to test that EF/NH are doing their jobs correctly; you need to test that your mappings/implementations are working with your database. In my opinion this is one of the most important parts of a system you can test.

    Strictly speaking, however, we're moving out of the domain of unit testing and into integration testing, but the principles remain the same.

    The first thing you need to do is to be able to mock your DAL so your BLL can be tested independently of EF and SQL. These are your unit tests. Next you need to design your integration tests to prove your DAL; in my opinion these are every bit as important.

    There are a couple of things to consider:

    1. Your database needs to be in a known state with each test. Most systems use either a backup or create scripts for this.
    2. Each test must be repeatable
    3. Each test must be atomic

    There are two main approaches to setting up your database, the first is to run a UnitTest create DB script. This ensures that your unit test database will always be in the same state at the beginning of each test (you may either reset this or run each test in a transaction to ensure this).
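
    A minimal sketch of that first option, in case it helps: a base fixture that runs a create/reset script once per run and wraps each test in a transaction. This is my own illustration, not part of the original answer; it assumes NUnit, a single-batch SQL script, and hypothetical names like TestSettings.ConnectionString.

    // needs NUnit.Framework, System.IO, System.Data.SqlClient and System.Transactions
    public abstract class KnownStateDatabaseTest
    {
        private TransactionScope _scope;

        [OneTimeSetUp]
        public void CreateUnitTestDatabase()
        {
            // run the create/reset script once so every run starts from the same known state
            var script = File.ReadAllText("UnitTestDb.sql"); // assumes a single batch (no GO separators)
            using (var connection = new SqlConnection(TestSettings.ConnectionString))
            {
                connection.Open();
                using (var command = new SqlCommand(script, connection))
                {
                    command.ExecuteNonQuery();
                }
            }
        }

        [SetUp]
        public void BeginTransaction()
        {
            _scope = new TransactionScope(); // each test runs inside an ambient transaction
        }

        [TearDown]
        public void RollbackTransaction()
        {
            _scope.Dispose(); // never completed, so everything rolls back - repeatable and atomic
        }
    }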

    Your other option is what I do: run specific setups for each individual test. I believe this is the best approach for two main reasons:

    • Your database is simpler, you don't need an entire schema for each test
    • Each test is safer, if you change one value in your create script it doesn't invalidate dozens of other tests.

    Unfortunately your compromise here is speed. It takes time to run all these tests, to run all these setup/tear down scripts.

    One final point, it can be very hard work to write such a large amount of SQL to test your ORM. This is where I take a very nasty approach (the purists here will disagree with me). I use my ORM to create my test! Rather than having a separate script for every DAL test in my system I have a test setup phase which creates the objects, attaches them to the context and saves them. I then run my test.

    This is far from the ideal solution however in practice I find it's a LOT easier to manage (especially when you have several thousand tests), otherwise you're creating massive numbers of scripts. Practicality over purity.

    I will no doubt look back at this answer in a few years (months/days) and disagree with myself as my approaches have changed - however this is my current approach.

    To try and sum up everything I've said above, this is my typical DB integration test:

    [Test]
    public void LoadUser()
    {
      this.RunTest(session => // the NH/EF session to attach the objects to
      {
        var user = new UserAccount("Mr", "Joe", "Bloggs");
        session.Save(user);
        return user.UserID;
      }, id => // the ID of the entity we need to load
      {
         var user = LoadMyUser(id); // load the entity
         Assert.AreEqual("Mr", user.Title); // test your properties
         Assert.AreEqual("Joe", user.Firstname);
         Assert.AreEqual("Bloggs", user.Lastname);
      });
    }
    

    The key thing to notice here is that the sessions used in the two lambdas are completely independent. In your implementation of RunTest you must ensure that the context is committed and destroyed, so that the data for the second part can only come from your database.
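
    For reference, here is a minimal sketch of what RunTest might look like. This is my own assumption written against an EF6 DbContext (MyDbContext is a hypothetical name); with EF, the arrange lambda would need to call SaveChanges before returning the generated ID.

    // Hypothetical helper - substitute your own context/session factory.
    protected void RunTest<TId>(Func<MyDbContext, TId> arrange, Action<TId> assert)
    {
        TId id;
        using (var context = new MyDbContext())
        {
            // the lambda adds/saves its entities and returns the generated ID
            id = arrange(context);
        }
        // the first context is disposed here, so nothing is cached in memory

        // the assert phase (LoadMyUser etc.) must open its own context,
        // which means the data it sees can only have come from the database
        assert(id);
    }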

    Edit 13/10/2014

    I did say that I'd probably revise this model over the upcoming months. While I largely stand by the approach I advocated above, I've updated my testing mechanism slightly. I now tend to create the entities in the TestSetup and clean them up in the TestTearDown.

    [SetUp]
    public void Setup()
    {
      this.SetupTest(session => // the NH/EF session to attach the objects to
      {
        var user = new UserAccount("Mr", "Joe", "Bloggs");
        session.Save(user);
        this.UserID =  user.UserID;
      });
    }
    
    [TearDown]
    public void TearDown()
    {
       this.TearDownDatabase();
    }
    

    Then test each property individually

    [Test]
    public void TestTitle()
    {
         var user = LoadMyUser(this.UserID); // load the entity
         Assert.AreEqual("Mr", user.Title);
    }
    
    [Test]
    public void TestFirstname()
    {
         var user = LoadMyUser(this.UserID);
         Assert.AreEqual("Joe", user.Firstname);
    }
    
    [Test]
    public void TestLastname()
    {
         var user = LoadMyUser(this.UserID);
         Assert.AreEqual("Bloggs", user.Lastname);
    }
    

    There are several reasons for this approach:

    • There are no additional database calls (one setup, one teardown)
    • The tests are far more granular, each test verifies one property
    • Setup/TearDown logic is removed from the Test methods themselves

    I feel this makes the test class simpler and the tests more granular (single asserts are good).
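
    For completeness, the SetupTest and TearDownDatabase helpers used above live in a base class that isn't shown; a rough sketch of how they could be implemented (my assumption, using EF6 and a hypothetical DeleteAllTestData script) is:

    protected void SetupTest(Action<MyDbContext> arrange)
    {
        using (var context = new MyDbContext())
        {
            // the lambda adds its entities and calls SaveChanges (or Save, for NH),
            // so database-generated IDs such as UserID are populated before the tests run
            arrange(context);
        }
    }

    protected void TearDownDatabase()
    {
        using (var context = new MyDbContext())
        {
            // wipe whatever the fixture created; a raw delete script keeps this simple
            context.Database.ExecuteSqlCommand("EXEC dbo.DeleteAllTestData");
        }
    }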

    Edit 5/3/2015

    Another revision of this approach. While class-level setups are very helpful for tests such as loading properties, they are less useful where different setups are required per test. In that case setting up a new class for each case is overkill.

    To help with this I now tend to have two base classes, SetupPerTest and SingleSetup. These two classes expose the framework as required.

    In SingleSetup we have a mechanism very similar to the one described in my first edit. An example would be:

    public class TestProperties : SingleSetup
    {
      public int UserID {get;set;}
    
      public override void DoSetup(ISession session)
      {
        var user = new User("Joe", "Bloggs");
        session.Save(user);
        this.UserID = user.UserID;
      }
    
      [Test]
      public void TestLastname()
      {
         var user = LoadMyUser(this.UserID); // load the entity
         Assert.AreEqual("Bloggs", user.Lastname);
      }
    
      [Test]
      public void TestFirstname()
      {
           var user = LoadMyUser(this.UserID);
           Assert.AreEqual("Joe", user.Firstname);
      }
    }
    

    However, tests which ensure that only the correct entities (references) are loaded may use a SetupPerTest approach:

    public class TestReferences : SetupPerTest
    {
       [Test]
       public void EnsureCorrectReferenceIsLoaded()
       {
          int friendID = 0;
          this.RunTest(session =>
          {
             var user = CreateUserWithFriend();
             session.Save(user);
             friendID = user.Friends.Single().FriendID;
          }, () =>
          {
             var user = GetUser();
             Assert.AreEqual(friendID, user.Friends.Single().FriendID);
          });
       }
       [Test]
       public void EnsureOnlyCorrectFriendsAreLoaded()
       {
          int userID = 0;
          this.RunTest(session =>
          {
             var user = CreateUserWithFriends(2);
             var user2 = CreateUserWithFriends(5);
             session.Save(user);
             session.Save(user2);
             userID = user.UserID;
          }, () =>
          {
             var user = GetUser(userID);
             Assert.AreEqual(2, user.Friends.Count());
          });
       }
    }
    

    In summary both approaches work depending on what you are trying to test.
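
    The two base classes themselves aren't shown above, so here is a rough sketch of how they could be structured. This is my own assumption rather than the author's exact code, using NUnit 3 attribute names and NHibernate's ISession as in the examples; OpenSession and CleanDatabase are left abstract for whatever session factory and delete script you use.

    public abstract class SingleSetup
    {
        protected abstract ISession OpenSession();   // however you create your NH/EF session
        protected abstract void CleanDatabase();     // e.g. run a delete script

        public abstract void DoSetup(ISession session);

        [OneTimeSetUp]
        public void FixtureSetup()
        {
            using (var session = OpenSession())
            {
                DoSetup(session);                    // one insert phase for the whole fixture
            }
        }

        [OneTimeTearDown]
        public void FixtureTearDown()
        {
            CleanDatabase();
        }
    }

    public abstract class SetupPerTest
    {
        protected abstract ISession OpenSession();
        protected abstract void CleanDatabase();

        protected void RunTest(Action<ISession> arrange, Action assert)
        {
            using (var session = OpenSession())
            {
                arrange(session);                    // independent session writes the data
            }

            assert();                                // second phase can only read from the database

            CleanDatabase();                         // keep each test atomic and repeatable
        }
    }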

  • 2020-11-28 17:43

    There is Effort, which is an in-memory Entity Framework database provider. I've not actually tried it... Ha, just spotted this was mentioned in the question!

    Alternatively you could switch to Entity Framework Core, which has an in-memory database provider built in.

    https://blog.goyello.com/2016/07/14/save-time-mocking-use-your-real-entity-framework-dbcontext-in-unit-tests/

    https://github.com/tamasflamich/effort

    I used a factory to get a context, so I can create the context close to its use. This seems to work locally in Visual Studio but not on my TeamCity build server; not sure why yet.

    return new MyContext(@"Server=(localdb)\mssqllocaldb;Database=EFProviders.InMemory;Trusted_Connection=True;");
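
    For what it's worth, the factory itself can be as small as an interface with one real implementation and one Effort-backed one. The sketch below is my own naming; the only Effort API used is DbConnectionFactory.CreateTransient, and it assumes MyContext exposes the DbContext(DbConnection, bool) constructor.

    public interface IMyContextFactory
    {
        MyContext Create();
    }

    // used by the application: a real SQL Server / LocalDB context
    public class SqlMyContextFactory : IMyContextFactory
    {
        public MyContext Create()
        {
            return new MyContext(@"Server=(localdb)\mssqllocaldb;Database=EFProviders.InMemory;Trusted_Connection=True;");
        }
    }

    // used by the tests: every call gets a fresh, empty in-memory database
    public class EffortMyContextFactory : IMyContextFactory
    {
        public MyContext Create()
        {
            var connection = Effort.DbConnectionFactory.CreateTransient();
            return new MyContext(connection, contextOwnsConnection: true);
        }
    }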
    
  • 2020-11-28 17:52

    I want to share an approach that has been commented on and briefly discussed here, but also show an actual example that I am currently using to help unit test EF-based services.

    First, I would love to use the in-memory provider from EF Core, but this is about EF 6. Furthermore, for other storage systems like RavenDB, I'd also be a proponent of testing via the in-memory database provider. Again--this is specifically to help test EF-based code without a lot of ceremony.

    Here are the goals I had when coming up with a pattern:

    • It must be simple for other developers on the team to understand
    • It must isolate the EF code at the barest possible level
    • It must not involve creating weird multi-responsibility interfaces (such as a "generic" or "typical" repository pattern)
    • It must be easy to configure and setup in a unit test

    I agree with previous statements that EF is still an implementation detail and it's okay to feel like you need to abstract it in order to do a "pure" unit test. I also agree that ideally, I would want to ensure the EF code itself works--but this involves a sandbox database, in-memory provider, etc. My approach solves both problems--you can safely unit test EF-dependent code and create integration tests to test your EF code specifically.

    The way I achieved this was through simply encapsulating EF code into dedicated Query and Command classes. The idea is simple: just wrap any EF code in a class and depend on an interface in the classes that would've originally used it. The main issue I needed to solve was to avoid adding numerous dependencies to classes and setting up a lot of code in my tests.

    This is where a useful, simple library comes in: MediatR. It allows for simple in-process messaging and it does it by decoupling "requests" from the handlers that implement the code. This has an added benefit of decoupling the "what" from the "how". For example, by encapsulating the EF code into small chunks it allows you to replace the implementations with another provider or totally different mechanism, because all you are doing is sending a request to perform an action.

    Utilizing dependency injection (with or without a framework--your preference), we can easily mock the mediator and control the request/response mechanisms to enable unit testing EF code.

    First, let's say we have a service that has business logic we need to test:

    public class FeatureService {
    
      private readonly IMediator _mediator;
    
      public FeatureService(IMediator mediator) {
        _mediator = mediator;
      }
    
      public async Task ComplexBusinessLogic() {
        // retrieve relevant objects
    
        var results = await _mediator.Send(new GetRelevantDbObjectsQuery());
        // normally, this would have looked like...
        // var results = _myDbContext.DbObjects.Where(x => foo).ToList();
    
        // perform business logic
        // ...    
      }
    }
    

    Do you start to see the benefit of this approach? Not only are you explicitly encapsulating all EF-related code into descriptive classes, you are allowing extensibility by removing the implementation concern of "how" this request is handled--this class doesn't care if the relevant objects come from EF, MongoDB, or a text file.

    Now for the request and handler, via MediatR:

    public class GetRelevantDbObjectsQuery : IRequest<DbObject[]> {
      // no input needed for this particular request,
      // but you would simply add plain properties here if needed
    }
    
    public class GetRelevantDbObjectsEFQueryHandler : IRequestHandler<GetRelevantDbObjectsQuery, DbObject[]> {
      private readonly IDbContext _db;
    
      public GetRelevantDbObjectsEFQueryHandler(IDbContext db) {
        _db = db;
      }
    
      public DbObject[] Handle(GetRelevantDbObjectsQuery message) {
        return _db.DbObjects.Where(foo => bar).ToArray();
      }
    }
    

    As you can see, the abstraction is simple and encapsulated. It's also absolutely testable because in an integration test, you could test this class individually--there are no business concerns mixed in here.
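
    To illustrate that, an integration test for the handler on its own might look something like this. It's a sketch under my own assumptions: TestDbContextFactory.CreateSeeded() is a hypothetical helper that hands back a real (sandbox or in-memory) context seeded with known rows, and the expected count matches whatever that seed contains.

    [TestClass]
    public class GetRelevantDbObjectsEFQueryHandlerTests {

      [TestMethod]
      public void Handle_Returns_Matching_DbObjects() {
        // arrange: a real context seeded with known rows - no mocks here
        IDbContext db = TestDbContextFactory.CreateSeeded();
        var handler = new GetRelevantDbObjectsEFQueryHandler(db);

        // act: run the exact EF query the production code will run
        DbObject[] results = handler.Handle(new GetRelevantDbObjectsQuery());

        // assert: only the seeded rows that satisfy the query's filter come back
        Assert.AreEqual(2, results.Length);
      }
    }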

    So what does a unit test of our feature service look like? It's way simple. In this case, I'm using Moq to do mocking (use whatever makes you happy):

    [TestClass]
    public class FeatureServiceTests {
    
      // mock of Mediator to handle request/responses
      private Mock<IMediator> _mediator;
    
      // subject under test
      private FeatureService _sut;
    
      [TestInitialize]
      public void Setup() {
    
        // set up Mediator mock
        _mediator = new Mock<IMediator>(MockBehavior.Strict);
    
        // inject mock as dependency
        _sut = new FeatureService(_mediator.Object);
      }
    
      [TestCleanup]
      public void Teardown() {
    
        // ensure we have called or expected all calls to Mediator
        _mediator.VerifyAll();
      }
    
      [TestMethod]
      public async Task ComplexBusinessLogic_Does_What_I_Expect() {
        var dbObjects = new List<DbObject>() {
          // set up any test objects
          new DbObject() { }
        };
    
        // arrange
    
        // setup Mediator to return our fake objects when it receives a message to perform our query
        // in practice, I find it better to create an extension method that encapsulates this setup here
        _mediator.Setup(x => x.Send(It.IsAny<GetRelevantDbObjectsQuery>(), default(CancellationToken)))
            .ReturnsAsync(dbObjects.ToArray())
            .Callback(
        (GetRelevantDbObjectsQuery message, CancellationToken token) => {
           // using Moq Callback functionality, you can make assertions
           // on expected request being passed in
           Assert.IsNotNull(message);
        });
    
        // act
        await _sut.ComplexBusinessLogic();
    
        // assertions
      }
    
    }
    

    You can see all we need is a single setup and we don't even need to configure anything extra--it's a very simple unit test. Let's be clear: This is totally possible to do without something like MediatR (you would simply implement an interface and mock it for tests, e.g. IGetRelevantDbObjectsQuery), but in practice for a large codebase with many features and queries/commands, I love the encapsulation and innate DI support MediatR offers.
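
    To make that alternative concrete, the MediatR-free version of the same idea would look roughly like this. It's a sketch: the interface name comes from the sentence above, the rest of the naming is mine, and the filter is the same placeholder used in the handler earlier.

    // One small interface per query: same encapsulation, no MediatR dependency.
    public interface IGetRelevantDbObjectsQuery {
      DbObject[] Execute();
    }

    public class GetRelevantDbObjectsEFQuery : IGetRelevantDbObjectsQuery {
      private readonly IDbContext _db;

      public GetRelevantDbObjectsEFQuery(IDbContext db) {
        _db = db;
      }

      public DbObject[] Execute() {
        return _db.DbObjects.Where(foo => bar).ToArray(); // same placeholder filter as above
      }
    }

    // the service from earlier, now depending on the query interface instead of IMediator
    public class FeatureService {
      private readonly IGetRelevantDbObjectsQuery _query;

      public FeatureService(IGetRelevantDbObjectsQuery query) {
        _query = query;
      }

      public void ComplexBusinessLogic() {
        var results = _query.Execute(); // in unit tests, just Mock<IGetRelevantDbObjectsQuery>
        // perform business logic as before...
      }
    }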

    If you're wondering how I organize these classes, it's pretty simple:

    - MyProject
      - Features
        - MyFeature
          - Queries
          - Commands
          - Services
          - DependencyConfig.cs (Ninject feature modules)
    

    Organizing by feature slices is beside the point, but this keeps all relevant/dependent code together and easily discoverable. Most importantly, I separate the Queries vs. Commands--following the Command/Query Separation principle.

    This meets all my criteria: it's low-ceremony, it's easy to understand, and there are extra hidden benefits. For example, how do you handle saving changes? Now you can simplify your Db Context by using a role interface (IUnitOfWork.SaveChangesAsync()) and mock calls to the single role interface or you could encapsulate committing/rolling back inside your RequestHandlers--however you prefer to do it is up to you, as long as it's maintainable.

    For example, I was tempted to create a single generic request/handler where you'd just pass an EF object and it would save/update/remove it--but you have to ask what your intention is and remember that if you wanted to swap out the handler with another storage provider/implementation, you should probably create explicit commands/queries that represent what you intend to do. More often than not, a single service or feature will need something specific--don't create generic stuff before you have a need for it.
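
    As a small example of the role-interface option mentioned above, the save-changes abstraction can be this thin. The single IUnitOfWork member comes from the paragraph; the EF-backed implementation and MyDbContext name are my own illustration.

    public interface IUnitOfWork {
      Task SaveChangesAsync();
    }

    // EF6-backed implementation; business code and handlers only see IUnitOfWork,
    // so unit tests can simply Mock<IUnitOfWork> and verify that a commit happened.
    public class EFUnitOfWork : IUnitOfWork {
      private readonly MyDbContext _db; // hypothetical concrete DbContext

      public EFUnitOfWork(MyDbContext db) {
        _db = db;
      }

      public Task SaveChangesAsync() {
        return _db.SaveChangesAsync(); // DbContext.SaveChangesAsync from EF6
      }
    }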

    There are of course caveats to this pattern--you can go too far with a simple pub/sub mechanism. I've limited my implementation to only abstracting EF-related code, but adventurous developers could start using MediatR to go overboard and message-ize everything--something good code review practices and peer reviews should catch. That's a process issue, not an issue with MediatR, so just be cognizant of how you're using this pattern.

    You wanted a concrete example of how people are unit testing/mocking EF and this is an approach that's working successfully for us on our project--and the team is super happy with how easy it is to adopt. I hope this helps! As with all things in programming, there are multiple approaches and it all depends on what you want to achieve. I value simplicity, ease of use, maintainability, and discoverability--and this solution meets all those demands.

  • 2020-11-28 17:54

    I would not unit test code I don't own. What are you testing here, that the MSFT compiler works?

    That said, to make this code testable, you almost HAVE to make your data access layer separate from your business logic code. What I do is take all of my EF stuff and put it in a (or multiple) DAO or DAL class which also has a corresponding interface. Then I write my service which will have the DAO or DAL object injected in as a dependency (constructor injection preferably) referenced as the interface. Now the part that needs to be tested (your code) can easily be tested by mocking out the DAO interface and injecting that into your service instance inside your unit test.

    //this is testable just inject a mock of IProductDAO during unit testing
    public class ProductService : IProductService
    {
        private IProductDAO _productDAO;
    
        public ProductService(IProductDAO productDAO)
        {
            _productDAO = productDAO;
        }
    
        public List<Product> GetAllProducts()
        {
            return _productDAO.GetAll();
        }
    
        ...
    }
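
    A unit test for that service then needs nothing more than a mock of the DAO interface. Here's a minimal sketch, assuming Moq and NUnit, that GetAll() returns a List<Product>, and that Product has a parameterless constructor; none of this is from the original answer.

    [TestFixture]
    public class ProductServiceTests
    {
        [Test]
        public void GetAllProducts_Returns_Whatever_The_DAO_Provides()
        {
            // arrange: no EF, no database - just a mock of the DAO interface
            var products = new List<Product> { new Product(), new Product() };
            var productDAO = new Mock<IProductDAO>();
            productDAO.Setup(d => d.GetAll()).Returns(products);

            var service = new ProductService(productDAO.Object);

            // act
            var result = service.GetAllProducts();

            // assert: the service returned exactly what the DAO handed it
            Assert.AreEqual(2, result.Count);
            productDAO.VerifyAll();
        }
    }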
    

    I would consider live Data Access Layers to be part of integration testing, not unit testing. I have seen guys run verifications before on how many trips to the database Hibernate makes, but they were on a project that involved billions of records in their datastore and those extra trips really mattered.

  • 2020-11-28 17:57

    I have fumbled around for some time to reach these considerations:

    1. If my application accesses the database, why shouldn't the tests? What if there is something wrong with data access? The tests should find out beforehand and alert me to the problem.

    2. The Repository Pattern is somewhat hard and time-consuming.

    So I came up with this approach, which I don't think is the best, but which fulfilled my expectations:

    Use TransactionScope in the test methods to avoid persisting changes to the database.
    

    To do it, it's necessary to:

    1. Install Entity Framework in the test project.
    2. Put the connection string in the app.config file of the test project.
    3. Reference the System.Transactions dll in the test project.

    The only side effect is that the identity seed will increment on each attempted insert, even when the transaction is aborted. But since the tests are run against a development database, this should be no problem.

    Sample code:

    [TestClass]
    public class NameValueTest
    {
        [TestMethod]
        public void Edit()
        {
            NameValueController controller = new NameValueController();
    
            using(var ts = new TransactionScope()) {
                Assert.IsNotNull(controller.Edit(new Models.NameValue()
                {
                    NameValueId = 1,
                    name1 = "1",
                    name2 = "2",
                    name3 = "3",
                    name4 = "4"
                }));
    
                //no complete, automatically abort
                //ts.Complete();
            }
        }
    
        [TestMethod]
        public void Create()
        {
            NameValueController controller = new NameValueController();
    
            using (var ts = new TransactionScope())
            {
                Assert.IsNotNull(controller.Create(new Models.NameValue()
                {
                    name1 = "1",
                    name2 = "2",
                    name3 = "3",
                    name4 = "4"
                }));
    
                //no complete, automatically abort
                //ts.Complete();
            }
        }
    }
    
  • 2020-11-28 17:58

    Effort Experience Feedback here

    After a lot of reading I have been using Effort in my tests: during the tests the context is built by a factory that returns an in-memory version, which lets me test against a blank slate each time. Outside of the tests, the factory is resolved to one that returns the real context.

    However I have a feeling that testing against a full-featured mock of the database tends to drag the tests down; you realize you have to take care of setting up a whole bunch of dependencies in order to test one part of the system. You also tend to drift towards organizing together tests that may not be related, just because there is only one huge object that handles everything. If you don't pay attention, you may find yourself doing integration testing instead of unit testing.

    I would have preferred testing against something more abstract rather than a huge DbContext but I couldn't find the sweet spot between meaningful tests and bare-bones tests. Chalk it up to my inexperience.

    So I find Effort interesting; if you need to hit the ground running it is a good tool to quickly get started and get results. However I think that something a bit more elegant and abstract should be the next step, and that is what I am going to investigate next. Favoriting this post to see where it goes next :)

    Edit to add: Effort does take some time to warm up, so you're looking at approx. 5 seconds at test start-up. This may be a problem for you if you need your test suite to be very fast.


    Edited for clarification:

    I used Effort to test a webservice app. Each message M that enters is routed to an IHandlerOf<M> via Windsor. Castle.Windsor resolves the IHandlerOf<M>, which resolves the dependencies of the component. One of these dependencies is the DataContextFactory, which lets the handler ask for a data context.

    In my tests I instantiate the IHandlerOf component directly, mock all the sub-components of the SUT and hand the Effort-wrapped DataContextFactory to the handler.
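
    In code, that wiring looks roughly like this. It's a sketch with hypothetical handler, factory, message and entity names (PlaceOrderHandler, EffortDataContextFactory, and so on); the only real Effort API used is DbConnectionFactory.CreateTransient.

    [Test]
    public void Handle_PlaceOrderMessage_WritesTheOrder()
    {
        // Effort hands out a fresh, empty in-memory database for this one test
        var connection = Effort.DbConnectionFactory.CreateTransient();
        var contextFactory = new EffortDataContextFactory(connection); // hypothetical test factory

        // every other dependency of the SUT is mocked; only the data access is real
        var notifier = new Mock<INotificationService>();

        var handler = new PlaceOrderHandler(contextFactory, notifier.Object); // an IHandlerOf<PlaceOrderMessage>
        handler.Handle(new PlaceOrderMessage { CustomerId = 42 });

        // assert directly against the same in-memory database the handler wrote to
        using (var context = contextFactory.Create())
        {
            Assert.AreEqual(1, context.Orders.Count(o => o.CustomerId == 42));
        }
    }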

    It means that I don't unit test in a strict sense, since the DB is hit by my tests. However, as I said above, it lets me hit the ground running and I could quickly test some points in the application.
