Novelty Vs New Contribution To Science.

Have you ever noticed how calls for conference contributions generally include a comment to the effect that "novel pieces of work" should be submitted, and how, when attending the conference, there are often multiple approaches to solving the same problem, many presented in a way which emphasises the novelty of the approach? Clearly, we would not want people to continually re-submit the same piece of work, and we do want to ensure that researchers are describing new ideas. However, I believe that the current reliance upon novelty is not a good thing, particularly if the definition of novelty means only "no one has tried it before", as currently seems to be the case. Understanding why I think this requires a bit of explanation.

Many papers which claim to have a novel approach motivate that novelty in terms of some new analysis principle. In particular, methods based upon conventional statistics seem to be regarded as uninteresting. However, a suggested new principle for data analysis which does not, in the end, come down to being equivalent to conventional probability theory is unlikely to stand the test of time. Imagine this example: someone publishes an algorithm based upon some "new" analysis principle. Then someone else comes along and shows how the new algorithm could actually have been derived from conventional probability theory (generally likelihood) on the basis of a few assumptions regarding the statistical nature of the data. Now, do we accept that these assumptions are actually those being made, and that the algorithm is after all another example of likelihood, or do we maintain that this is a new principle for data analysis? While this scenario may seem unlikely to those new to research areas involving image (and signal) processing, it is my observation that it has played out many times, in various forms, in the literature, over a broad range of subjects. There has only ever been one outcome, with the research community eventually losing interest once it becomes clear that the new technique was not that "novel" after all, and could have been straightforwardly designed by turning the handle with standard statistical methods. In particular, once it becomes apparent that the required assumptions are unrealistic, no one can support the position that the method is a good idea any more. Many researchers even have an aversion to standard analysis techniques altogether, and as soon as the word "likelihood" heaves into view the method is immediately disowned, almost as if its very familiarity implied that its use could never be novel.
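The classic instance of this reduction is least-squares fitting: minimising a sum of squared errors is exactly maximum likelihood under the assumption of i.i.d. zero-mean Gaussian noise. A minimal sketch (the data, model, and grid here are entirely hypothetical, chosen only to illustrate the equivalence):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a line y = 2x + 1 corrupted by i.i.d. Gaussian noise.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)

def sse(a, b):
    """Sum of squared residuals for the model y = a*x + b."""
    return np.sum((y - (a * x + b)) ** 2)

def log_likelihood(a, b, sigma=0.1):
    """Gaussian log-likelihood of the same model, assuming
    i.i.d. zero-mean noise of known standard deviation sigma."""
    r = y - (a * x + b)
    return (-0.5 * y.size * np.log(2 * np.pi * sigma**2)
            - np.sum(r**2) / (2 * sigma**2))

# Search the same coarse grid of (a, b) with both criteria.  Because the
# log-likelihood is a constant minus SSE / (2 sigma^2), the SSE minimiser
# and the likelihood maximiser are the same point.
grid = [(a, b) for a in np.linspace(1.5, 2.5, 101)
               for b in np.linspace(0.5, 1.5, 101)]
best_sse = min(grid, key=lambda p: sse(*p))
best_ll = max(grid, key=lambda p: log_likelihood(*p))
```

The "two" criteria agree term by term: the log-likelihood differs from the negative SSE only by a constant offset and a positive scale factor, so any algorithm sold as minimising squared error has silently made the Gaussian noise assumption.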

It has actually been remarked to me that if someone succeeded in developing a new theory of data analysis which was not probability theory, it would of course be worth a Nobel prize. This probably gives a good indication both of how important such a result would be, and of how unlikely it is to happen. Unfortunately, once all of the obvious ways to analyse data have been published, the only "novel" methods left are those that probably have no solid foundation. This kind of error could be avoided if researchers became familiar with the standard statistical methods and their real limitations, but the field's natural aversion to conventional approaches makes this unlikely. This is why I do not like an emphasis on novelty.

I think it is time for all of us to reconsider our idea of novelty. Instead, I would go back to the original definition of scientific progress: a contribution to knowledge. A contribution to science simply requires that a publication contains some solid results which would be of quantitative use to others trying to achieve the same task, independent of datasets or unspecified implementation details: for example, clear identification of the assumptions behind simple algorithms, tests which provide transferable performance figures, or derivations of principled statistical methods which provide valid solutions. And if some of these are based upon the old chestnut of likelihood, why not, provided the result is valid and justifiable? All of this counts as a new (novel) contribution, if it has not been previously published. Novelty should not have to mean "we don't know why anyone would expect this to work yet".

The effect that the emphasis on novelty has had on published work should not be underestimated. Work is often far more prevalent in the literature if it contains a good piece of mathematics, and more straightforward approaches can seem dull by comparison. Who, for example, would not prefer a closed-form solution to the camera calibration problem (making all sorts of unrealistic demands on the data set) to a statement that the best approach requires robust statistics, which must inevitably mean an iterative (or brute-force) solution? I believe many of the good (valid) pieces of work in the field, which are based upon solid statistical principles and can be shown to make realistic assumptions regarding the data, have simply been buried under a mass of novelty. It is for this reason that we have started this wiki page. If you know of a piece of work (preferably not your own) that appears to be under-represented in your field, why not put it here?
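The closed-form versus robust trade-off above can be seen in the simplest possible setting, estimating a location from contaminated data. The closed-form least-squares answer is the mean, which a single gross outlier can drag arbitrarily far; a robust estimate survives, but only via iteration. A minimal sketch using iteratively reweighted averaging with Huber-style weights (the data values, weight threshold, and iteration count are hypothetical choices for illustration, not a recommendation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: 20 samples near 5.0, plus one gross outlier at 50.0.
data = np.concatenate([rng.normal(5.0, 0.1, 20), [50.0]])

# Closed-form least-squares location estimate: the sample mean.
mean_estimate = data.mean()

def huber_location(x, k=1.0, iterations=50):
    """Iteratively reweighted location estimate with Huber-style
    weights: points within k of the current estimate get full weight,
    points further away are down-weighted by k / |residual|."""
    mu = np.median(x)  # robust starting point
    for _ in range(iterations):
        r = x - mu
        w = np.where(np.abs(r) <= k, 1.0, k / np.abs(r))
        mu = np.sum(w * x) / np.sum(w)
    return mu

robust_estimate = huber_location(data)
# The outlier pulls the closed-form mean well away from 5.0,
# while the iterative robust estimate stays close to it.
```

The point is structural: the robust estimate has no closed form because the weights depend on the residuals, which depend on the estimate, so iteration is unavoidable. That makes it less glamorous to publish than a closed-form formula, but closer to what real data demands.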

NAT 21/2/2005

Page last modified on March 05, 2012, at 04:02 PM