
I’ve been doing stress testing on my Business Process Management (BPM) framework. What this involves is launching hundreds of processes into the system and watching to see that they all complete in the expected way. Because there are hundreds, the watching part has to be done by a testing framework I’ve built. The test framework is multi-threaded and the BPM framework is multi-threaded, so there are a lot of threads bouncing around in all kinds of hard-to-predict ways. Not actually unpredictable, but too hard to be worth trying to predict. What I need to know is the outcome rather than exactly what happened.
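Roughly, the harness boils down to something like the sketch below. The names here (launchProcessAndAwaitCompletion, Outcome) are placeholders I've made up to stand in for the real framework calls, and Java is just an assumption about the stack; the point is only the shape of it: fire off hundreds of processes concurrently, then check each one's outcome.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

public class StressTest {
    // Hypothetical result of one process run: its id and whether it reached the expected end state.
    record Outcome(int processId, boolean completedAsExpected) {}

    public static void main(String[] args) throws Exception {
        int processCount = 500;                       // hundreds of processes per run
        ExecutorService pool = Executors.newFixedThreadPool(16);
        List<Future<Outcome>> futures = new ArrayList<>();

        for (int i = 0; i < processCount; i++) {
            final int id = i;
            futures.add(pool.submit(() -> {
                // Placeholder for whatever the real BPM API exposes to start a process
                // and wait for it to finish.
                boolean ok = launchProcessAndAwaitCompletion(id);
                return new Outcome(id, ok);
            }));
        }

        // The "watching" part: only the failures need a human to go digging through the logs.
        for (Future<Outcome> f : futures) {
            Outcome o = f.get(60, TimeUnit.SECONDS);
            if (!o.completedAsExpected()) {
                System.err.println("Process " + o.processId() + " did not complete as expected");
            }
        }
        pool.shutdown();
    }

    // Stand-in for the real framework call; here it just pretends everything succeeds.
    static boolean launchProcessAndAwaitCompletion(int id) {
        return true;
    }
}
```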

Until something doesn’t have the right outcome, of course. Then I have to trawl through the logs and figure out where something went wrong and why. It can be something simple, like the process not being defined the way I thought, or it might be complex and the result of some interaction between threads.

It struck me that this is a tiny bit like what those clever people looking for the Higgs Boson have to do. In their case they accelerate protons to enormous energies and slam them together. Then they pore over the results to see what happened. Because each run generates terabytes of data (mine generates less than a meg) they have to use really, really smart tools to sift through the results. When they find something it is usually the result of a second or third level of particle fragmentation, and they have to work backwards to see what it was that was produced in the first place without being able to directly observe it.

So that is a lot harder than what I am doing, obviously. But the errors that crop up in my framework are also sometimes caused by something that happened two or three steps beforehand, and that means I have to work backwards as well.

Fortunately I have far less data to work with and there aren’t any undiscovered particles involved that may or may not be there.
