At National Day of Civic Hacking 2017, a City of Virginia Beach data scientist told me that average response time is a bad indicator because of situational factors. For example, in an emergency an officer might have to choose between logging the arrival time and responding.
I'm just summing the values and dividing by the number of values. Natasha Singh-Miller suggested a different method, but I don't know which one. We can ask her. We should filter outliers for sure.
I've just realized I've been using the wrong terms here. The current method is the sum of the values divided by the number of values (the average, i.e. the mean). We need a better method that discards outliers such as extremely long response times and possibly instantaneous (0) response times.
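For reference, here's a minimal sketch of the kind of filtering we're talking about (not the project's actual code; the function name and the 1.5 × IQR rule are just one common, assumed choice). It drops Null/0 response times, then discards anything far outside the interquartile range before averaging:

```python
import statistics

def robust_mean(response_times):
    """Mean response time after discarding obvious outliers."""
    # Drop missing (None) and instantaneous (0) response times.
    values = sorted(t for t in response_times if t is not None and t > 0)
    if not values:
        return None
    if len(values) < 4:
        # Too few points for a meaningful quartile-based filter.
        return sum(values) / len(values)

    # Keep only values within 1.5 * IQR of the middle 50% of the data.
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    kept = [t for t in values if low <= t <= high]
    return sum(kept) / len(kept)
```

So something like `robust_mean([None, 0, 4.8, 4.9, 5.0, 5.2, 5.5, 6.1, 480.0])` would ignore the missing, zero, and extremely large entries instead of letting them skew the result. A trimmed mean or the median would be other reasonable options.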
Yeah that's what I thought you meant. I wasn't sure if you had a specific method in mind, or just something better than average.
kmcurry changed the title from "Change average response time to mean" to "Discard outliers in data, ex. response time of Null, 0, or very large number" on Sep 24, 2018.