Question
A new project we began introduced a lot of new technologies we weren't familiar with, and an architecture we don't have much practice in. In other words, the interfaces and interactions between the service classes and so on of what we're building are fairly volatile, even more so due to internal and customer feedback. Though I've always been frustrated by the ever-moving specification, I see this to some degree as a necessary part of building something we've never built before - if we just stuck to the original design and scope, the end product would probably be a whole lot less innovative and useful than it's becoming.
I also introduced test-driven development (TDD), as the benefits are well-documented and conceptually I loved the idea. Two more new things to learn - NUnit and mocking - but seeing all those green circles made it all worthwhile.
Over time, however, those constant changes in design seemed to mean I was spending a whole lot more time changing my tests than I was on writing the code itself. For this reason alone, I've gone back to the old ways of testing - that is, not automated.
While I have no doubt that the application would be far more robust with hundreds of excellent unit tests, I've found the trade-off of time to launch the product to be mostly unacceptable. My question is, then - have any of you also found that TDD can be a hassle if you're prototyping something / building a beta version? Does TDD go much more naturally hand-in-hand with something where the specifications are more fixed, or where the developers have more experience in the language and technologies? Or have I done something fundamentally wrong?
Note that I'm not trying to criticise TDD here - it's just that I'm not sure it's always the best fit for all situations.
Answer 1:
The short answer is that TDD is very valuable for beta versions, but may be less so for prototyping.
I think it is very important to distinguish between beta versions and prototyping.
A beta version is essentially a production version that is just still in development, so you should definitely use TDD in that scenario.
A prototype/proof of concept is something you build with the express intent of throwing it away once you've gotten the answers out of it that you wanted.
It's true that project managers will tend to push for the prototype to be used as a basis for production code, but it is very important to resist that. If you know resisting won't be possible, treat the prototype code as you would your production code from the start, because you know it is going to become your production code in the future - and that means you should use TDD with it as well.
When you are learning a new technology, most code samples etc. are not written with unit tests in mind, so it can be difficult to translate the new technology to the unit testing mindset. It most definitely feels like a lot of overhead.
In my experience, however, unit testing often really forces you to push the boundaries of the new technology that you are learning. Very often, you need to research and learn all the different hooks the new technology provides, because you need to be able to isolate the technology via DI or the like.
Instead of only following the beaten path, unit testing frequently forces you to learn the technology in much more depth, so what may feel like overhead is actually just a more in-depth prototype - one that is often more valuable, because it covers more ground.
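As a rough illustration of that isolation (a minimal Java sketch with made-up names, not anything from the question's codebase): wrap the unfamiliar technology behind your own interface and inject it, so tests can substitute a fake.

```java
// Hypothetical sketch: wrapping an unfamiliar messaging library behind our own
// interface so the rest of the code can be unit tested without the real technology.
interface MessagePublisher {
    void publish(String topic, String payload);
}

// Production implementation delegates to the new library (details omitted).
class RabbitMessagePublisher implements MessagePublisher {
    @Override
    public void publish(String topic, String payload) {
        // ...calls into the real messaging API here...
    }
}

// The class under test receives the dependency via constructor injection,
// so a test can pass in a hand-rolled fake (or a mock) that records what was published.
class OrderService {
    private final MessagePublisher publisher;

    OrderService(MessagePublisher publisher) {
        this.publisher = publisher;
    }

    void placeOrder(String orderId) {
        // ...domain logic...
        publisher.publish("orders", orderId);
    }
}
```

Finding the right seam for that interface is exactly the "learning the hooks" work described above.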
Personally, I think unit testing a new technology is a great learning tool.
The symptoms you seem to be experiencing regarding test maintainability are a bit orthogonal, I think. Your tests may be Overspecified, which is something that can happen just as well when working with known technologies (though I think it is probably easier to fall into this trap when you are also learning a new technology at the same time).
The book xUnit Test Patterns describes the Overspecified Test antipattern and provides a lot of guidance and patterns that can help you write more maintainable tests.
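As a rough, hypothetical illustration of the difference (JUnit 4, with a throwaway class defined inline - not an example from the book): the overspecified test pins down incidental details, the focused one asserts only the outcome.

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class DiscountTest {

    // Minimal hypothetical class under test.
    static class Invoice {
        private double total;
        private String auditLog = "";

        Invoice(double total) { this.total = total; }

        void applyDiscount(double rate) {
            total = total * (1 - rate);
            auditLog = "Applied " + Math.round(rate * 100) + "% discount";
        }

        double getTotal() { return total; }
        String getAuditLog() { return auditLog; }
    }

    // Overspecified: pins down the exact wording of an incidental audit message,
    // so any harmless change to that text breaks the test.
    @Test
    public void overspecified() {
        Invoice invoice = new Invoice(100.0);
        invoice.applyDiscount(0.1);
        assertEquals("Applied 10% discount", invoice.getAuditLog());
    }

    // Focused: asserts only the behaviour the requirement actually cares about.
    @Test
    public void focused() {
        Invoice invoice = new Invoice(100.0);
        invoice.applyDiscount(0.1);
        assertEquals(90.0, invoice.getTotal(), 0.001);
    }
}
```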
Answer 2:
I've found that thoroughly testing early results in lots of code thrown away and an empty feeling in the pit of your stomach.
Test what needs to be tested and not a line of code more. When you figure out how much that is, let me know.
Answer 3:
When prototyping, I would say it depends on the type of prototyping. In evolutionary prototyping, where the prototype evolves into the final application, I would utilize unit testing as early as possible. If you are using throw-away prototyping, I wouldn't bother with unit testing - the final application is going to be nothing like the prototype.
I'm not sure what you mean by "beta", since a beta is almost a finished product. As soon as you start working on code that is going to be shipped, or has a potential to be shipped, make sure everything is well tested.
Now, pure test-driven development might be extreme, but it is important to make sure that all shippable code is as tested as possible, at the unit, integration, and system level.
Answer 4:
have any of you also found that TDD can be a hassle if you're prototyping something / building a beta version?
I have.. Tons of times :)
Does TDD go much more naturally hand-in-hand with something where the specifications are more fixed, or where the developers have more experience in the language and technologies?
Not really. TDD works quite nicely with changing requirements, but TDD is really about ensuring a stable and contract-driven design: two things that prototypes don't really need that badly.
Or have I done something fundamentally wrong?
Doesn't look like it :) You've just seen that TDD isn't all golden trees.
Answer 5:
"Over time, however, those constant changes in design seemed to mean I was spending a whole lot more time changing my tests than I was on writing the code itself"
Good. You should spend a lot of time on testing. It's important, and it's how you demonstrate that your code is right. "Less code than test" is a good benchmark.
That means that you were writing effective tests that demonstrated your expectations for the underlying technology.
You may want to consider splitting your tests along these lines.
Some tests cover "essential" or "core" or "enduring" features of the application, independent of any technology choices. Focus on these. These should never change.
Some tests confirm the technology or implementation choices. These change all the time. Perhaps you should segregate these so that technology changes lead to focused changes here.
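One way to act on that separation (a hypothetical sketch, JUnit 4, all names invented) is to keep the two kinds of tests in different classes or packages:

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;

// Hypothetical sketch: keep "enduring" business-rule tests separate from
// technology-facing tests so that a change of framework only touches the latter.
public class ShippingCostRuleTest {

    // Minimal hypothetical rule under test - pure domain logic, no framework types.
    static class ShippingCostRule {
        static double costFor(double orderTotal) {
            return orderTotal >= 100.0 ? 0.0 : 4.95;
        }
    }

    // Core/enduring test: expresses a business rule that should survive any
    // change of persistence layer, web framework, or messaging technology.
    @Test
    public void ordersOfAtLeastOneHundredShipFree() {
        assertEquals(0.0, ShippingCostRule.costFor(120.0), 0.001);
    }
}

// Technology-facing tests (e.g. JsonSerializationTest, RepositoryMappingTest)
// would live in a separate package such as com.example.adapters, so that
// swapping the technology produces focused changes there and nowhere else.
```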
Answer 6:
- Being given a roadmap with a moving X is frustrating, TDD or no TDD. Having to spend the majority of your time changing the tests instead of the code indicates either that the specs changed radically or that you've over-mocked yourself, a.k.a. "fragile tests" (see the sketch after this list). I'd need more input from you to comment further.
- Spike/prototyping means trying to build something as a proof of concept or to validate a design. With this definition, I'd say that you don't need to TDD your proto, because your primary goal is learning / reducing the unknowns. Once you've done that, you should throw away your proto and build your production version with automated tests (use TDD now). That's what you ship to the customer, not "tested prototypes".
- However, if manual testing has been working well for you, persist with it. I like to prove to myself and others that my code works at the push of a button - it avoids the human boredom of repetitive tasks and gives thorough testing.
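To make the over-mocking point from the first bullet concrete, here is a hypothetical sketch (JUnit 4 plus Mockito, all names invented) of a fragile interaction-verifying test next to a sturdier result-checking one:

```java
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.*;

import org.junit.Test;

public class PriceCalculatorTest {

    // Hypothetical collaborator and class under test.
    interface TaxService { double rateFor(String country); }

    static class PriceCalculator {
        private final TaxService taxService;
        PriceCalculator(TaxService taxService) { this.taxService = taxService; }
        double gross(double net, String country) {
            return net * (1 + taxService.rateFor(country));
        }
    }

    // Fragile/over-mocked: verifies exactly how the collaborator is called.
    // Caching the rate or restructuring the calls would break this test even
    // though the observable behaviour is unchanged.
    @Test
    public void fragileInteractionTest() {
        TaxService tax = mock(TaxService.class);
        when(tax.rateFor("DE")).thenReturn(0.19);
        new PriceCalculator(tax).gross(100.0, "DE");
        verify(tax, times(1)).rateFor("DE");
        verifyNoMoreInteractions(tax);
    }

    // More robust: stubs the collaborator but asserts on the result only.
    @Test
    public void behaviourTest() {
        TaxService tax = mock(TaxService.class);
        when(tax.rateFor("DE")).thenReturn(0.19);
        assertEquals(119.0, new PriceCalculator(tax).gross(100.0, "DE"), 0.001);
    }
}
```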
Shipping Prototypes will bite you sooner and harder than you ever imagine. Take it from me.
Answer 7:
Prototyping is meant to be used for "would this kind of thing work?" exploration.
So there is no need for testing. BUT! Always throw your prototype away and code the real thing from scratch!
Answer 8:
In agile development there's the concept of a "spike" - a deep but narrow investigation of a solution or a technology. Once you're comfortable with how things should work, you start over with a new implementation at a higher quality level.
There's a pitfall with software labeled as "beta" - all of a sudden you end up with something not intended for production as a vital part of your application, and you haven't got the time to redo it. This will most likely come back and bite you later on. A prototype should really be just a prototype - no more, no less.
Answer 9:
I don't have a prescription for you, but I do have a diagnosis:
If you ever end up in a debugger, you shoulda had more tests. Even in the very earliest prototypes, you want the code you write to work, right? If I don't do TDD and I run into a bug, it's hard to retrofit unit tests to find the bug. It's tempting to go to the debugger. So, I aim my TDD efforts at producing a good enough set of tests that I don't need the debugger. Doing this well requires lots of practice.
If you want to refactor but don't because of risk, you shoulda had more tests. If I'm going to be working with some code for more than a couple hours, I want to refactor to keep my velocity high. If I hesitate to refactor, it's gonna hurt my productivity, so I do everything I can to make refactoring safe and easy. Again, a good selection of tests is exactly what I need.
Answer 10:
For pure prototyping, as people have said, not so useful. But often a prototype has elements of the application that will follow.
As you build it, what are the solid concepts that you expect to continue? What are the key domain models? It makes sense to build tests around these. Then, they provide a solid base on which to explore ideas. By having part of the code base solid and tested, it allows you to go further with the prototype than you could with no tests.
It's always a balance of picking the right things to test, and when. From your description it sounds like you are adopting quite a few new things at once - sometimes this can be too much, and you need to retreat a little to optimize your progress.
Answer 11:
What I typically do for this prototyping code is to write it without tests (or not many), but do the work under my src/test/java or wherever I'm putting my test code. That way, I won't inadvertently put it into production. If there's something I've prototyped that I like, then I'll create something in src/main/java (my prod code) and TDD it, pulling over code from the prototype one piece at a time as I add tests.
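For example (a hypothetical sketch - the package, file name, and the CSV idea are all made up):

```java
// Lives under src/test/java, e.g. src/test/java/com/example/spike/CsvSpike.java,
// so it can never be packaged into the production artifact by accident.
package com.example.spike;

import org.junit.Test;

public class CsvSpike {

    // Not a real assertion-driven test - just a convenient, runnable scratchpad
    // for exploring an idea. If the idea survives, a properly TDD'd version is
    // built under src/main/java and code is pulled across piece by piece.
    @Test
    public void exploreCsvParsingIdea() {
        String line = "42;widget;9.99";
        String[] parts = line.split(";");
        System.out.println("id=" + parts[0] + " name=" + parts[1] + " price=" + parts[2]);
    }
}
```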
Answer 12:
Over time, however, those constant changes in design seemed to mean I was spending a whole lot more time changing my tests than I was on writing the code itself.
I write (and run) automated tests of the system's I/O. The system's I/O depends on the functional requirements and doesn't change when the system's implementation changes, so I can refactor the implementation without changing the automated tests (see: Should one test internal implementation, or only test public behaviour?).
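A tiny sketch of the same principle at unit scale (JUnit 4; the kata-style class is invented here): the test pins down only the input/output contract, so the implementation behind it can be rewritten freely.

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class RomanNumeralsTest {

    // Hypothetical public API under test; the implementation behind it is free
    // to change (lookup table, recursion, whatever) without touching this test.
    static class RomanNumerals {
        static String toRoman(int n) {
            int[] values = {1000, 900, 500, 400, 100, 90, 50, 40, 10, 9, 5, 4, 1};
            String[] symbols = {"M", "CM", "D", "CD", "C", "XC", "L", "XL", "X", "IX", "V", "IV", "I"};
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < values.length; i++) {
                while (n >= values[i]) {
                    sb.append(symbols[i]);
                    n -= values[i];
                }
            }
            return sb.toString();
        }
    }

    // Input/output only: this expresses the functional requirement,
    // independent of how the conversion is implemented.
    @Test
    public void convertsArabicToRoman() {
        assertEquals("XIV", RomanNumerals.toRoman(14));
        assertEquals("MMXXIV", RomanNumerals.toRoman(2024));
    }
}
```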
Answer 13:
When I get in contact with a new technology, I usually write a couple of tests which I use as the basis of the prototype. The main advantage here is that it allows me to get used to the technology in small, digestible parts and that I document my progress in the only valid form of documentation: code.
That way, I can test assumptions (what happens when I persist an empty string in a database? DB2 will return an empty string, Oracle will return NULL) and make sure they don't change while I work on the prototype.
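For instance, such an assumption test might look like this hypothetical sketch (JUnit 4 with plain JDBC against an in-memory H2 database purely for illustration; the table is made up):

```java
import static org.junit.Assert.assertEquals;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import org.junit.Test;

public class EmptyStringAssumptionTest {

    // Documents (and guards) the assumption that persisting an empty string
    // gives an empty string back. Against Oracle this would fail, because
    // Oracle stores empty VARCHARs as NULL - exactly the kind of surprise
    // you want a test to catch early.
    @Test
    public void emptyStringSurvivesRoundTrip() throws Exception {
        try (Connection con = DriverManager.getConnection("jdbc:h2:mem:spike")) {
            con.createStatement().execute("CREATE TABLE notes (id INT, text VARCHAR(100))");

            try (PreparedStatement insert = con.prepareStatement("INSERT INTO notes VALUES (?, ?)")) {
                insert.setInt(1, 1);
                insert.setString(2, "");
                insert.executeUpdate();
            }

            try (ResultSet rs = con.createStatement().executeQuery("SELECT text FROM notes WHERE id = 1")) {
                rs.next();
                assertEquals("", rs.getString("text"));
            }
        }
    }
}
```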
Think of this work as many, tiny prototypes all rolled into a couple of unit tests.
Now the design of my product will change over time but these spike tests are still valid: The underlying framework doesn't change just because I now pursue a different solution. In fact, these tests will allow me to migrate to the next version of the technology since they will ring an alarm when one of the assumptions I've been using has changed. Otherwise, I'd have to stay with the current version or be very careful with any upgrade since I couldn't be sure what it would break.
Answer 14:
I would be cautious about dismissing TDD when prototyping, primarily because prototypes tend to (but don't always) evolve into the final product. If you can be sure that the prototypes are thrown out once they have established whatever it was they were started for, then you can be a bit more lax. If, however, there is a chance that either the prototype will evolve into the final product or parts of its code will be ported into the final product, then I would endeavor to follow TDD principles.
Not only does this make your code more testable, but more importantly (in my mind) it encourages you to follow good software design principles.
Answer 15:
If it's actually a prototype and you are going to throw it away when you're done, then take whatever route gets you through that phase. But ask yourself: how often do your prototypes actually get thrown away, versus how often do they creep into your final product?
There was a good presentation at JavaOne on TDD. One of the really nice outcomes was that you tended to understand your actual requirements much, much better when you had to author tests to meet them. This is in addition to all of the refactoring and code quality benefits that came along with the approach.
Source: https://stackoverflow.com/questions/1271716/is-unit-testing-a-bad-idea-during-beta-prototyping