Costing (Software Engineering stuff .. *meh*)



  • Backstory: I attend a university that is not far from earning the honor of being called WTF-U. Proof:

    • In my first year, I almost failed a C programming project because I used 'goto' to break out of two nested loops
    • This year the lecturer for the Java course refused to introduce 'finally' and 'for (String itm : m_StrCollection) { }' because they were 'too confusing'

    With that said, I'm currently writing a preliminary report for a major (self-chosen) project I have to complete throughout third year. It's a Tablet PC-based "e-learning" application that'll teach geography using handwriting, touch and gestures.

    The report requires me (among other things) to estimate the complexity of the system in terms of lines of code and man-hours. This was supposedly taught in Software Engineering last year.

    Since I figured something like this was bound to happen, I saved all the slides from last year, only to find that this was never covered. So I figured I'd take the more or less obvious approach of pretending I'm a consultant and multiplying the number of days MS Project says I'll spend on the project by a fictional daily rate. But I don't think that's what they want... or is it?

    A bit of searching on the web revealed this nugget: http://en.wikipedia.org/wiki/Cost_estimation_in_software_engineering

    But which of those methods do I use? COCOMO seems the most fitting since it uses man-hours and lines of code, but how do I calculate the number of lines of code for a project that hasn't been written yet? I don't have any historical data to draw on either, since none of my previous work (like an AIML chat bot in C) is related in any way. :/
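
    For reference, Basic COCOMO boils down to two formulas: effort = a * KLOC^b (person-months) and schedule = c * effort^d (months), with published coefficients per project class. Below is a minimal C# sketch using the published "organic" class coefficients; the 8 KLOC figure is a made-up placeholder, since guessing KLOC up front is exactly the hard part:

        using System;

        class CocomoSketch
        {
            static void Main()
            {
                // Basic COCOMO, "organic" mode (small team, familiar problem domain):
                //   effort (person-months) = a * KLOC^b
                //   schedule (months)      = c * effort^d
                const double a = 2.4, b = 1.05, c = 2.5, d = 0.38;

                double kloc = 8.0; // placeholder guess, in thousands of lines of code

                double effort = a * Math.Pow(kloc, b);   // person-months
                double months = c * Math.Pow(effort, d); // elapsed calendar months

                Console.WriteLine("Effort:   {0:F1} person-months", effort);
                Console.WriteLine("Schedule: {0:F1} months", months);
            }
        }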

    So, how can I estimate the cost of a project without having built it, given only an MS Project-created Gantt chart, an estimated number of days to complete it, a decent grasp of its complexity, and no idea of the number of lines of code?

    I still have three weeks to finish the report, so this isn't the typical "do my homework for me" question. I'm very willing to figure it out by myself, but I need some pointers in the right direction to get started. Anyone willing to provide some?



  • I don't know about university, but in the real world lines of code cost nothing; man-hours cost money. There are of course people who analyze existing code bases to produce a more or less fictional guess, based on LOC, of how many man-hours it must have cost to build something. For instance, see this tool that made the rounds on Slashdot a while back: http://www.dwheeler.com/sloc/

    Personally I would just ask them to clarify what they actually want.

    Also, if you actually have some code, you can run various tools to measure its complexity. Google 'CRAP index', 'cyclomatic complexity', and perhaps 'continuous integration', since CI tools usually run all that metrics stuff for you.
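
    If it helps, the CRAP index combines those two ideas: it weighs a method's cyclomatic complexity against its test coverage. A rough C# sketch of the published formula, with invented sample numbers:

        using System;

        class CrapSketch
        {
            // CRAP index: comp^2 * (1 - coverage)^3 + comp, where comp is a
            // method's cyclomatic complexity and coverage is its test
            // coverage expressed as a fraction between 0 and 1.
            static double Crap(int complexity, double coverage)
            {
                return Math.Pow(complexity, 2) * Math.Pow(1.0 - coverage, 3) + complexity;
            }

            static void Main()
            {
                // Invented sample values: a complexity-10 method at 40% coverage...
                Console.WriteLine("CRAP = {0:F1}", Crap(10, 0.40)); // 31.6
                // ...versus the same method at 95% coverage.
                Console.WriteLine("CRAP = {0:F1}", Crap(10, 0.95)); // 10.0
            }
        }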



  • @stratos said:

    Personally I would just ask them to clarify what they actually want.
     

    +1

    Also, how long it takes to program something depends so much on the programmer (their experience with the particular environment, the language, even the size of their screen, RAM, and hard drive) that using someone else's figures is nearly useless.

    In my experience, it helps to drill down as deeply as you can and divide what you're trying to build into logical blocks that you more or less understand, in the sense that you know from experience how long each will take. You don't give much detail about the project you propose, but I see plenty of complexity. In fact, with my students I would probably try to convince them to drop some of that complexity unless they had quite a bit of experience, and lots of time (say, 3 months).
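
    To make the drill-down concrete, here is a trivial bottom-up sketch in C#; the blocks and hour figures are invented stand-ins for your own breakdown:

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class BreakdownSketch
        {
            static void Main()
            {
                // Invented blocks and hour estimates -- replace with your own.
                var estimates = new Dictionary<string, int>
                {
                    { "Map rendering & display",   60 },
                    { "Touch/gesture input",       40 },
                    { "Handwriting integration",   30 },
                    { "Lesson content & quizzes",  50 },
                    { "Testing & polish",          40 },
                };

                int total = estimates.Values.Sum();
                Console.WriteLine("Raw estimate: {0} hours", total);

                // Common informal hedge: pad the total for the parts you forgot.
                Console.WriteLine("With 50% contingency: {0} hours", total * 3 / 2);
            }
        }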



  • Asking for clarification it is!

    That is probably indeed the best approach. I have to meet up with my mentor on Tuesday, so I'll bring this issue up. Now I also have some arguments to throw at my mentor in case I'm told something like "just follow the instructions"-- although that probably won't happen.

    Re: complexity: I have a lot of time to work on this (July '10 -> April '11) and am pretty good with C#. I can offload some of the work to MS' Tablet PC API, so complexities like handwriting recognition are dealt with at a basic level. I obviously can't offload everything onto external components, since my contributions have to be significant, but the way I planned it out on paper should make it work out fine. The reason I'm so vague about the project is that I don't want to risk WTF-U's lecturers identifying me. Yeah, I guess I'm paranoid... Anyway, it's planned out much further than my first post makes it sound.

    Thanks for all the pointers, stratos and b_redeker. I'll report back once I've found out what it is exactly that they want me to write about.



  • @garyniger said:

    I have a lot of time to work on this (July '10 -> April '11)
     

    We've always found that the amount of time to do a piece of work expands to fill the time available.

    For you that means the cost will be...

    Number of weeks * Number of hours per week spent on the assignment * pretty good C# programmer's hourly rate

    Where
       Number of weeks = 43 (July '10 through April '11)
       Number of hours per week spent on the assignment = You'd know this better than I would.  Say, 10?
       Pretty good C# programmer's hourly rate = Local recruiters may be able to answer this
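
    A quick C# sketch of that arithmetic, with a placeholder hourly rate (ask those recruiters for a real figure):

        using System;

        class CostSketch
        {
            static void Main()
            {
                int weeks = 43;           // July '10 through April '11
                int hoursPerWeek = 10;    // your guess is better than mine
                decimal hourlyRate = 30m; // placeholder -- check local market rates

                Console.WriteLine("Estimated cost: {0:C}", weeks * hoursPerWeek * hourlyRate);
            }
        }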

