Unit testing - where do I start?



  • @boomzilla said:

    It depends on your application, don't it?

    Not in my experience. I have always found that items can be decomposed and then organized so that unit tests on simple data are achievable. Of course I have not done everything, but 35+ years seems a fairly large sample period.

    ps: I did not mean to imply that you were introducing the conflation of unit and integration testing; merely that in discussing data requirements for unit testing, the topic of integration testing is not directly relevant.


  • ♿ (Parody)

    @TheCPUWizard said:

    I have always found that items can be decomposed and then organized so that unit tests on simple data are achievable.

    I'm not saying it isn't achievable. I'm saying it's not worth it to me vs what I get from the time spent building integration tests.

    Though there are some parts of my system that are good candidates for unit testing and indeed have legitimate unit tests.



  • @TheCPUWizard said:

    Not in my experience. I have always found that items can be decomposed and then organized so that unit tests on simple data are achievable. Of course I have not done everything, but 35+ years seems a fairly large sample period.

    Oh come on.

    This is lame bragging, TheCPUWizard. You can do better than this. Tell us how expensive your car is.

    Man, what happened to you? It's like you woke up one morning and suddenly weren't an egotistical asshole anymore. I miss the old CPUWizard. (TheOldCPUWizard.)



  • @blakeyrat said:

    This is lame bragging

    I don't see how it is "bragging" at all. And I am surprised at the attitude given the recent compliments I paid you...

    That being said, if you or anyone can produce an example of something that has "complex data" requirements to test a single unit of executable code, where that unit of code can not be decomposed/refactored to be testable with simpler data, I would appreciate it being posted...

    ps: One advantage of being a Wizard is that while many years may pass, I am not Old 😜
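    As a concrete illustration of the decomposition being described, here is a minimal Python sketch (the `bulk_discount` function and its pricing rule are invented for the example): logic that would otherwise require a fully populated order loaded from a database can be factored into a pure function, and the unit tests then need only simple scalar data.

    ```python
    # Hypothetical example: instead of testing a method that needs an Order
    # row loaded from a database, factor the rule into a pure function.

    def bulk_discount(unit_price: float, quantity: int) -> float:
        """Pure pricing rule: 10% off orders of 100 or more units."""
        total = unit_price * quantity
        return total * 0.9 if quantity >= 100 else total

    # Unit tests now need only simple data, no database rows:
    assert bulk_discount(2.0, 10) == 20.0
    assert bulk_discount(2.0, 100) == 180.0
    ```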



  • @TheCPUWizard said:

    I don't see how it is "bragging" at all.

    That's what makes it lame! You used to be the bragging expert, man.

    @TheCPUWizard said:

    That being said, if you or anyone can produce an example of something that has "complex data" requirements to test a single unit of executable code, where that unit of code can not be decomposed/refactored to be testable with simpler data, I would appreciate it being posted...

    You're using a different definition of "unit test" than the organization I'm working for uses.

    We only call them "integration tests" if they're talking to a system OUTSIDE our company. Anything inside (even stuff dealing with database schema) is called a "unit test".

    This is like 90% of the confusion in this thread; the terminology isn't nailed down.



  • The problem is, his definition is the right one. Your company's is wrong.



  • I don't care.

    I'm just explaining the difficulty.



  • What's even the difference, from a purely practical and non-formal standpoint, between unit tests involving mock databases and a localdb test instance? You can reasonably assume your DB engine works at least as well as the mock library. I see no advantage at all.
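    One way to see that point in code (a Python sketch; the store interface, `InMemoryUserStore`, and `get_display_name` are invented names): the test exercises the same interface whether it is backed by a fake or a real engine, so the assertions are identical either way.

    ```python
    # Sketch: a fake and a real backend implement the same interface,
    # so the code under test (and the test itself) cannot tell them apart.

    class InMemoryUserStore:
        """Stand-in for a database-backed store; same interface, no engine."""
        def __init__(self, rows):
            self._rows = dict(rows)

        def find(self, user_id):
            return self._rows.get(user_id)

    def get_display_name(store, user_id):
        row = store.find(user_id)
        return row["name"].title() if row else "<unknown>"

    store = InMemoryUserStore({1: {"name": "alice"}})
    assert get_display_name(store, 1) == "Alice"
    assert get_display_name(store, 2) == "<unknown>"
    ```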



  • It depends on how much time doing that adds to the test run. Unit tests that aren't fast are worthless, because they'll be run less frequently.



  • Fair enough. Though a small DB shouldn't add that much overhead - most of it is initialization, and unless you do a complete DB rebuild per single test, that should only run once, perhaps a few times per suite.
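    The amortization described above can be sketched like this (Python; `SuiteFixture` and its build counter are hypothetical): pay the expensive initialization once, and let every test in the suite reuse the result.

    ```python
    # Sketch: expensive setup (e.g. building a test database) runs once
    # per suite, not once per test, so its cost is paid a single time.

    class SuiteFixture:
        _db = None

        @classmethod
        def db(cls):
            if cls._db is None:
                cls._db = cls._build()  # expensive init runs only once
            return cls._db

        @classmethod
        def _build(cls):
            cls.build_count = getattr(cls, "build_count", 0) + 1
            return {"users": []}  # stand-in for a real schema load

    for _ in range(50):  # fifty tests share one initialization
        SuiteFixture.db()
    assert SuiteFixture.build_count == 1
    ```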



  • The end result of all this is of course NCrunch, which, if you manage to somehow get it working, runs your tests at the most useful possible frequency. Even Mighty Moose, though that thing is likely to give you entertaining icons in the margin, such as 'Beware, all ye who enter here' when it finds code with no tests.

    Because those clearly have some value, Microsoft has decided to move CodeLens down to VS Pro, which means that all tests that have to do with a method are available from the method itself.



  • @Magus said:

    NCrunch

    Yeah, that's a thing. You will want to optimize your tests for that, I agree.



  • From what I can tell, it prioritizes tests, so that isn't as much of a problem for it as it could be, but if running tests is a pain, it won't happen often enough. The more frequently you get feedback, the more useful it is.



  • Speed and Isolation are the key reasons.

    1. Strive to be able to run at least 1000 unit tests in under 1 second. Even one roundtrip to the DB is likely to eat up or exceed the 1 ms timeframe.

    2. WHEN a test fails, you want 99.99% certainty that the problem is in the code that is the target of the test, and not something outside of that scope.
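    A back-of-the-envelope check of that budget (a Python sketch; the no-op lambdas stand in for real unit tests): 1000 tests in one second leaves about 1 ms apiece, which a single DB roundtrip can easily consume on its own.

    ```python
    # Sketch: time a suite of 1000 trivial in-process tests against a
    # 1-second budget (i.e. 1 ms per test on average).
    import time

    def run_suite(tests):
        start = time.perf_counter()
        for t in tests:
            t()
        return time.perf_counter() - start

    tests = [lambda: None] * 1000  # trivial in-process "tests"
    elapsed = run_suite(tests)
    budget = 1.0  # seconds for 1000 tests, i.e. 1 ms apiece
    assert elapsed < budget
    ```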



  • @TheCPUWizard said:

    Strive to be able to run at least 1000 unit tests in under 1 second. Even one roundtrip to the DB is likely to eat up or exceed the 1 ms timeframe.

    Are you assuming that your test database is off on some other box?



  • @tarunik said:

    Are you assuming that your test database is off on some other box?

    Actually I was not (though that would probably make things longer and less reliable, since there are more dependencies). I was just doing a first-order estimate of actual times with SQL Server running locally, starting a cold connection, and selecting a "reasonable" row with a `WHERE PK='value'` clause.

