Code Quality Insights for OpenCV Project

Dear OpenCV contributors,

I hope this message finds you well. At TIOBE, we specialize in software quality measurement and improvement, and we have recently set up a TiCS analysis for OpenCV.

The results are encouraging: the current TQI (TIOBE Quality Indicator) score is 70.83%, which corresponds to a low C rating. Even more interestingly, we have included historical data going back one year, allowing us to track how the quality of the codebase evolves over time.

With TiCS, you can get:

  • Objective quality measurements, based on the ISO 25010 standard

  • Insights from deep flow analysis tools, beyond what traditional linters or SonarQube provide

  • A uniform dashboard for tracking trends and comparing projects consistently

We would be happy to share the detailed results with you and explore the possibilities together. If you are interested, I’d like to propose a short demonstration session where we can walk you through the analysis and discuss how TiCS could be valuable for OpenCV and its ecosystem.

To include more people who might be interested, we could extend the invite to the OpenCV core team or contributor community, or even host a session for contributors who want to learn more about software quality in their daily work. Would you perhaps know the best way to reach the broader OpenCV community?

Looking forward to your thoughts!

Best regards,
Rob Goud

Feel free to reach out to me via rob.goud@tiobe.com

The results are now available for checking here: OpenCV TiCS results

I would like to share some bugs with you, which I will do in my next post, as I can only provide 2 links per post in this forum.

There is much more interesting material in the results.
Please reach out to me if you want to learn more about the TiCS measurements and results.

We are happy to give you a short demonstration of the results.

If you would like to have TiCS run on another one of your projects, let me know too.

Looking forward to your feedback.
Rob

This forum is mostly frequented by the community, rarely by those in charge of the project.

For official contact, please try the contact form: Contact Us - OpenCV

Thanks for the information on this! Still, the community using the library may also be interested in the possibility of code quality analysis with TiCS, so I will post these insights here as well.

Using the Coverity tool of Blackduck, we found a path through the code where a dereference of the null pointer “colsdata” can happen.

Possible null pointer dereference in calibration.cpp

In the annotated source, use the Trace button in the “violation badge” to see the full Coverity trace on how it concluded this problem in the code.

Another one:

Using the Coverity tool of Blackduck, we found a complex cascading if-then-else construction in agast_score.cpp where an if branch and an else branch are identical.

Cascading if-then-else in agast_score.cpp

Having identical branches typically indicates a copy-paste mistake and is a bug. If the branches are supposed to be identical, the construction can be simplified instead.

Using the Coverity tool of Blackduck, we found a function call in matchers.cpp where the arguments appear to be swapped.

Swapped arguments in matchers.cpp

The positions of arguments in the constructor for “cv::DMatch” do not match the ordering of the parameters:

  • “m0.trainIdx” is passed to “_queryIdx”.
  • “m0.queryIdx” is passed to “_trainIdx”.
    This is most likely a bug and would lead to unexpected behaviour.

and there is a prime example of these analysis tools seeing the code but not understanding it.

the code does bidirectional matching, hence the swap.

that entire module is a train wreck resembling the output of a PRNG. it’s more likely to get deported to the contrib repo, or axed entirely, before anyone would bother making sense of it.

Following up on the earlier issues I posted, I’d like to share some overall feedback on two key quality aspects that stand out in the OpenCV codebase: cyclomatic complexity and code duplication.

  • Cyclomatic Complexity: The project scores a D (65.62%), with an average of 3.67 paths per function. Although acceptable overall, a number of files are significantly more complex, which makes them harder to understand, test, and maintain.
    Files with highest complexity

  • Code Duplication: The project scores an F (39.40%), with 10.34% of the code duplicated. This can drive up maintenance costs, as fixes or changes need to be replicated across multiple places, increasing the risk of inconsistencies.
    Duplication hotspots

This isn’t meant as criticism — OpenCV is a fantastic project with huge impact — but as constructive input on areas that could be improved in the long run. Reducing complexity and duplication would make the code easier to understand, safer to evolve, and cheaper to maintain.

Curious to hear whether these points resonate with the maintainers’ current priorities and if there are already initiatives underway in these areas.

Best,
Rob