Dev Post #13 – 11/28/2016 to 12/2/2016

Finalizing the New Damages Billing Endpoint

Toward the beginning of this week I had most of this endpoint logic well in hand, with the exception of a few key pieces of information. First, I had yet to figure out how to reconcile the new items we would be receiving from the tablet with the old items already in the database – my thought was that I wanted to find a way to reuse the old information if at all possible. And second, I hadn't yet figured out how to determine the cost for each damage definition, because the process was a little more nuanced than I had previously thought – each definition has its own cost, but it may also have a cost associated with a specific building on campus (which is admittedly pretty clever, because it allows items to depreciate in value differently in different buildings depending on how old they are).
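To make that structure concrete, here's a rough sketch of the shape of the data. The names (DamageDefinition, HallDefinition) and fields are made up for illustration, not our actual classes:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DamageDefinition:
    """A billable damage item, e.g. 'Broken Window'."""
    id: Optional[int]
    name: str
    minor_cost: float  # cost billed for minor severity
    major_cost: float  # cost billed for major severity
    active: bool = True


@dataclass
class HallDefinition:
    """An optional per-building override of a definition's costs."""
    damage_definition_id: int
    hall_id: int
    minor_cost: float
    major_cost: float
```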

I got to work creating a web service helper that would handle these two problems. In retrospect I should have written tests around each function before actually implementing them, but I was seriously still learning how everything worked at the time and was more concerned with figuring out what part of the application I even needed to add stuff to…not really a good excuse but it’s the truth.

Regardless, I started with reconciling the old and new items in the damages database. After talking with my boss I realized the solution was much easier than I was making it out to be. I was envisioning having to create some fancy algorithm to determine when a new item corresponded to an old item, but in reality it was just a matter of checking whether an active item with the new item's name (or whatever item name we received from the tablet) existed in the database, and creating it if it didn't. What's more, apparently there is validation in the application that forces an item to be updated before someone can be billed for it if its cost is set to $0. We give each new item a bunch of default values (including $0 for both damage severity costs) so it will trigger this validation and force whoever is confirming the items to be billed to update them to proper values. Ideally this process only has to happen the first time a new item is added to the database, so that every subsequent time it is used the definition will just work (by virtue of having been fixed with correct values).
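Boiled down, the reconciliation logic looks something like this (a sketch building on the made-up classes above, with a hypothetical repository interface rather than our real one):

```python
def get_or_create_definition(repo, item_name: str) -> DamageDefinition:
    """Reuse an existing active definition by name, or create a placeholder.

    New definitions deliberately get $0 for both severity costs: the
    application already refuses to bill an item with a $0 cost, so a
    reviewer is forced to fill in real values the first time the item
    is used. After that the definition should just work.
    """
    definition = repo.find_active_by_name(item_name)
    if definition is not None:
        return definition

    definition = DamageDefinition(
        id=None,          # assigned by the database on save
        name=item_name,
        minor_cost=0.0,   # $0 triggers the "update before billing" validation
        major_cost=0.0,
        active=True,
    )
    repo.save(definition)
    return definition
```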

After finishing this I wrote a simple set of tests to verify functionality: one verifying that trying to retrieve a definition for an item that did not exist did, in fact, create a new definition for that item, and another verifying that trying to retrieve a definition for an existing item actually retrieved that definition. With this I was satisfied enough to move on to the next problem – determining a definition's correct cost.
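In spirit, the two tests look roughly like this (pytest-style, with a hypothetical repo fixture supplying a test repository):

```python
def test_unknown_item_creates_new_definition(repo):
    definition = get_or_create_definition(repo, "Mystery Stain")

    assert definition.name == "Mystery Stain"
    assert definition.minor_cost == 0.0  # placeholder cost forces a review


def test_existing_item_returns_that_definition(repo):
    existing = DamageDefinition(id=1, name="Broken Window",
                                minor_cost=25.0, major_cost=75.0)
    repo.save(existing)

    definition = get_or_create_definition(repo, "Broken Window")

    assert definition.id == existing.id
```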

This problem was admittedly much more straightforward than the previous one. It was really a simple matter of looking for a hall definition matching the damage definition's ID, using the hall definition's cost if it existed and the damage definition's own cost if it didn't. One important note is that we have decided to always use the minor severity cost when submitting damages from the tablet, erring on the side of charging students less and leaving the decision to charge more up to whoever is reviewing the damages before sending out the invoice.
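The whole thing fits in a few lines; a sketch along the same hypothetical lines as above:

```python
def get_billing_cost(hall_repo, definition: DamageDefinition, hall_id: int) -> float:
    """Prefer the building-specific cost; fall back to the definition's own.

    Tablet submissions always bill the minor severity cost; a reviewer can
    bump it up to the major cost before the invoice goes out.
    """
    hall_definition = hall_repo.find_by_definition_and_hall(definition.id, hall_id)
    if hall_definition is not None:
        return hall_definition.minor_cost
    return definition.minor_cost
```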

As before, I wrote some tests once I finished writing the code: one to verify it gets the correct cost for a damage definition with a hall definition, and one to verify it gets the correct cost for a damage definition without one.
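Their shape, give or take (again with a hypothetical hall_repo fixture):

```python
def test_cost_uses_hall_definition_when_one_exists(hall_repo):
    definition = DamageDefinition(id=1, name="Broken Window",
                                  minor_cost=25.0, major_cost=75.0)
    hall_repo.save(HallDefinition(damage_definition_id=1, hall_id=42,
                                  minor_cost=10.0, major_cost=40.0))

    assert get_billing_cost(hall_repo, definition, hall_id=42) == 10.0


def test_cost_falls_back_to_definition_without_hall_definition(hall_repo):
    definition = DamageDefinition(id=1, name="Broken Window",
                                  minor_cost=25.0, major_cost=75.0)

    assert get_billing_cost(hall_repo, definition, hall_id=42) == 25.0
```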

As I was writing the tests I explored two different approaches – the initial one involved using fake repositories to mock saving and loading things from the database. After going down this path a bit I realized that I actually wanted to test whether things were really being saved in the database, so I pivoted and built out some new (real) repositories to test with. I actually ended the week with some failing tests and data not saving into the database correctly, but I'm positive it's just something small that I'll be able to fix early next week.
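For reference, the fake-repository route looked roughly like the in-memory stand-in below. It's fine for exercising the helper's logic, but it can't tell you whether rows actually land in the database, which is why I pivoted to real repositories:

```python
class FakeDamageDefinitionRepository:
    """In-memory stand-in for the real repository (illustrative only)."""

    def __init__(self):
        self._items = {}
        self._next_id = 1

    def save(self, definition):
        # Mimic the database assigning an ID on first save.
        if definition.id is None:
            definition.id = self._next_id
            self._next_id += 1
        self._items[definition.id] = definition
        return definition

    def find_active_by_name(self, name):
        return next((d for d in self._items.values()
                     if d.active and d.name == name), None)
```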

Revisiting Report Filtering

Hello, old friend. I'm not sure if I've mentioned this before, but this is the same card that got backlogged several months ago…and now Josh has burned through enough of our cards to get back to it – it's honestly pretty nuts how much work he's gotten done in such a short amount of time. Luckily I was stubborn enough to refuse to delete the branch (even though we usually delete branches that are just a few weeks old at most). We were able to reuse a lot of the existing work, which is good because we spent at least a few solid weeks of effort on the report filtering the first time around.

We started with the Vacancy Report and gradually moved to the Early Arrival Registration report, adding generic filtering options as we needed them. Once we touched Early Arrival we actually ended up breaking the Arrival and Early Arrival Billing reports as well, so there was a good amount of work to do before everything was functional again. One hitch we ran into was when we were trying to make everything generic and completely reusable – Josh and I went in circles for about an hour trying out several different solutions, but ended up having to create two similar, but different, table filtering objects to solve the problem. While the solution certainly isn't perfect, we minimized the amount of duplication by extracting the actual filtering logic (which doesn't change between filtering objects) into a helper.
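The general shape we landed on, very loosely (made-up names, not our real filter objects): two filter objects that differ only in how they describe their columns, both handing the actual row filtering off to one shared helper.

```python
def apply_filters(rows, filters):
    """Shared filtering logic: keep a row only if every filter matches.

    `filters` maps a field name to a predicate, so each table filter
    object only has to translate its own column setup into this form.
    """
    return [
        row for row in rows
        if all(predicate(row.get(field)) for field, predicate in filters.items())
    ]


# e.g. the Vacancy Report's filter object might end up building something like:
vacancy_filters = {
    "building": lambda value: value == "North Hall",
    "vacant": lambda value: value is True,
}
filtered_rows = apply_filters([{"building": "North Hall", "vacant": True}],
                              vacancy_filters)
```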

We ended up with generic report filters that worked just as well as I imagined they would. Of course we wrote a bunch of exhaustive tests around each set of filters to make sure all the different variations would work – it was a lot of work up front but we’re all extremely confident that the filtering is working exactly as expected.
