Traffic data that provides genuine intelligence: why quality is key

30 March, 2020


They say data is the new gold. The difference is that, these days, there's too much of it. It's no longer a question of whether you should use data to make decisions, but how to use it most effectively.

And the answer to that lies in data quality. With such a wealth of information to choose from, businesses that get selective about their data sources and data science are best placed to make better decisions and thrive.

Traffic data in particular has been around for a long time. At Kalibrate, our records go back to 1963. That's great longitudinal information to have, but longevity alone doesn't make quality data. What sets it apart is its accuracy. From then to this day, we have used actual recorded counts to ensure the information fueling our own models, and any that our customers might use, is as accurate as possible.

Here, we examine some critical aspects for ensuring traffic intelligence with integrity.

The quest for quality inputs

Your decision making is only as good as the data it's based on. To generate high-quality traffic intelligence, modeling should be based on actual recorded counts, rather than estimates. This ensures you're working with information that is as close to the truth as possible. Ask suppliers where they source their data from, how often it's updated, which format it's provided in, and whether you can see a sample data set up front.

Some agencies can't afford complete traffic studies, so some roads are prioritized above others, and published counts age and become less applicable. No matter what data you use or how complex the model, untreated data can have serious implications for the accuracy of your decisions, so make sure you know how your provider treats their data.
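One simple, automatable check along these lines is count freshness: separating counts recent enough to trust from those due for re-collection. The sketch below illustrates the idea; the record fields, the five-year cutoff, and the sample values are all illustrative assumptions, not Kalibrate's actual rules.

```python
from datetime import date

# Hypothetical count records: each carries the date it was recorded.
counts = [
    {"road": "Main St", "aadt": 18200, "recorded": date(2019, 9, 14)},
    {"road": "Oak Ave", "aadt": 4300, "recorded": date(2012, 4, 2)},
]

def flag_stale(records, as_of, max_age_years=5):
    """Split roads into those with fresh counts and those needing re-collection."""
    fresh, stale = [], []
    for rec in records:
        age_years = (as_of - rec["recorded"]).days / 365.25
        (fresh if age_years <= max_age_years else stale).append(rec["road"])
    return fresh, stale

fresh, stale = flag_stale(counts, as_of=date(2020, 3, 30))
```

A real pipeline would layer further checks on top (plausible count ranges, coverage by road class), but even this one-pass age filter surfaces how much of a dataset is past its shelf life.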

A single source of truth

Combining traffic data with other data sets, such as demographic data, can offer you much deeper insight. This means taking the high quality data set you’ve sourced and integrating it with other data sets. Combining, rationalizing, and standardizing datasets is critical to create a single source of truth that you can rely on, and draw comparisons and conclusions from. That means translating file types, formats, rows, columns, and curve balls into standardized databases that offer real value.
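In practice, much of that standardization work is reconciling identifiers and formats so that records from different sources actually line up. The sketch below shows the core idea with two tiny illustrative datasets; the field names (`segment_id`, `SEG_ID`, `aadt`, `median_income`) and values are assumptions for the example, not a real schema.

```python
# Hypothetical sources keyed on differently named, differently cased IDs.
traffic = [
    {"segment_id": "TX-0042", "aadt": 18200},
    {"segment_id": "TX-0107", "aadt": 4300},
]
demographics = [
    {"SEG_ID": "tx-0042", "median_income": 61000},
    {"SEG_ID": "tx-0107", "median_income": 48500},
]

def normalize_key(raw):
    """Standardize segment IDs so the two sources can be joined reliably."""
    return raw.strip().upper()

# Build the single source of truth: traffic counts enriched with demographics.
merged = {normalize_key(r["segment_id"]): dict(r) for r in traffic}
for row in demographics:
    key = normalize_key(row["SEG_ID"])
    if key in merged:  # join only where both sources describe the same segment
        merged[key]["median_income"] = row["median_income"]
```

The same pattern scales up with a proper data frame library, but the essential step is identical: normalize the join keys first, then merge, so every downstream comparison draws on one consistent table.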

Taking the training opportunity

The application of technologies like AI and machine learning is incredibly exciting, and opens up a host of data science opportunities. These technologies are widely used already, and enable efficiencies in the gathering, processing, and reading of traffic intelligence. However, it's important that they're trained with due consideration and fueled by clean, standardized data.

Accounting for anomalies 

Raw data will always come with some anomalies. If roads are quiet due to freak weather, basic anomaly detection can extract that outlier. In extreme circumstances, like those we find ourselves in now, human expertise will need to be applied to datasets to provide context and aid decision making.
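To make "basic anomaly detection" concrete, here is one minimal approach: flagging daily counts that sit far from the road's typical level, using a robust median-based score so the outlier itself doesn't distort the baseline. The counts and threshold are illustrative assumptions, not a description of Kalibrate's production method.

```python
import statistics

# Hypothetical daily counts for one road; the sixth day is depressed
# by freak weather and should be flagged as an outlier.
daily_counts = [18100, 18350, 17980, 18220, 18400, 9600, 18150, 18290]

def flag_outliers(counts, threshold=3.0):
    """Return counts more than `threshold` robust z-scores from the median."""
    med = statistics.median(counts)
    mad = statistics.median(abs(c - med) for c in counts)
    # 1.4826 scales the median absolute deviation to approximate a
    # standard deviation when the underlying data is roughly normal.
    return [c for c in counts if abs(c - med) / (1.4826 * mad) > threshold]

outliers = flag_outliers(daily_counts)
```

A rule like this catches the mechanical cases; the point in the paragraph above stands, though: when whole networks shift at once, the statistics can't tell you why, and human context has to take over.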

Big data abounds, and traffic data is no exception. Having been recorded for almost 60 years, there’s plenty of it available. But not all data is created equal. Anyone looking to glean business intelligence and validate investment decisions needs pinpoint accuracy in their understanding of their trade area, and that includes traffic counts. That means ensuring accuracy from even the most granular level of data. 

At Kalibrate, we have a vast library of raw information waiting to be explored, examined, and put to use in a number of ways, within a number of industries. TrafficMetrix®️ creates confidence in decision making. Explore sample data for your area to see how you could use it.

