Question
I recently designed a time series module where my time series is essentially a SortedDictionary<DateTime, double>.
Now I would like to create unit tests to make sure that this module always works and produces the expected results.
A common operation is to compute the performance between the points in the time series.
So what I do is create a time series with, say, {1.0, 2.0, 4.0} (at some dates), and I expect the result to be {100%, 100%}.
The thing is, if I manually create an expected time series with the values {1.0, 1.0} and check it against the computed result for equality (by comparing each point), the test will not pass, because there will always be inaccuracies when working with binary representations of real numbers.
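For example, even a single performance calculation can produce a value that is not bit-for-bit equal to the decimal result you would write by hand (the values below are illustrative, not the ones from the series above):
using System;

// 1.1 has no exact binary representation, so the computed return
// differs slightly from the decimal value 0.1.
double performance = 1.1 / 1.0 - 1.0;   // ≈ 0.10000000000000009, not exactly 0.1
Console.WriteLine(performance == 0.1);  // False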
Hence, I decided to create the following function:
private static bool isCloseEnough(double expected, double actual, double tolerance = 0.002)
{
    // Compare the squared difference against the squared tolerance,
    // i.e. effectively |expected - actual| < tolerance.
    double difference = expected - actual;
    return difference * difference < tolerance * tolerance;
}
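In a test, this helper is then wrapped in an assertion. A minimal sketch using NUnit-style assertions (the assertion style and the result variable are assumptions, not part of the original code):
// Hypothetical usage: 'result' is the computed performance, expected to be 100 % (i.e. 1.0).
Assert.That(isCloseEnough(1.0, result), Is.True,
    $"Expected a value close to 1.0 but got {result}.");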
Is there another common way of dealing with such a case?
Explanation / Answer
I can think of two other ways to deal with this problem:
You can use Is.InRange:
Assert.That(result, Is.InRange(expected-tolerance, expected+tolerance));
You can use Math.Round:
Assert.That(Math.Round(result, sigDigits), Is.EqualTo(expected));
I think that both ways are more expressive than a dedicated function, because the reader can see precisely what's going on with your number before it gets compared to the expected value.
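For reference, here is a minimal NUnit sketch showing both assertions inside a complete test; the fixture, the tolerance value, and the inline performance calculation are illustrative, not taken from the original module:
using System;
using NUnit.Framework;

[TestFixture]
public class PerformanceTests
{
    private const double Tolerance = 0.002;

    [Test]
    public void PerformanceBetweenTwoPoints_IsOneHundredPercent()
    {
        double expected = 1.0;             // 100 %
        double actual = 2.0 / 1.0 - 1.0;   // performance between two consecutive points

        // Approach 1: assert the value lies inside an explicit range.
        Assert.That(actual, Is.InRange(expected - Tolerance, expected + Tolerance));

        // Approach 2: round before comparing for exact equality.
        Assert.That(Math.Round(actual, 3), Is.EqualTo(expected));
    }
}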