If you cannot find anything more, look for something else (Bridget Fountain)
"Indicators are meaningless. They only may spark discussion"; "Les indicateurs n'ont aucun sens, ils ne peuvent servir qu'à initier une discussion"
I do believe that most scientific performance indicators (impact factor, h-index, Erdős number) have no absolute meaning, and perhaps no meaning at all. Their computation (assumptions, algorithms, databases, counts) is somewhat obscure, often flawed, and influenced by questionable practices such as author or journal self-citation, whether deliberate or imposed. Many people (especially those in power) nevertheless trust and worship them. Understanding how they are built, observing their evolution over time, and assessing their relative behavior with respect to your own publishing goals is thus a useful side task in scholarly publishing.

Here are graphs comparing the Scopus-based (Elsevier) SCImago Journal Rank and a Thomson-Reuters-like Impact Factor (IF), with signal, image, and video processing links. Google Scholar also provides a (Hirsch) h-index for Top publications - Engineering & Computer Science, Top publications - Chemical & Material Sciences, Top publications - Physics & Mathematics, and Top publications - Life Sciences & Earth Sciences, where you can see that arXiv, though unrefereed, performs quite well. You may as well check SIVA Conferences (no ranking).
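As a toy illustration, the two most arithmetic of these indicators reduce to a few lines of code. Below is a minimal Python sketch (my own, not any database's official computation) of the Hirsch h-index and of a two-year Impact Factor; the citation counts are invented.

# Toy sketch of two indicators: the Hirsch h-index and a two-year,
# Thomson-Reuters-like Impact Factor. Counts are invented; real
# databases differ on which items and citations they include.

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    # With counts sorted in decreasing order, "count >= rank" holds for
    # exactly the first h positions, so counting them yields h.
    return sum(1 for rank, count in enumerate(ranked, start=1) if count >= rank)

def impact_factor(cites_this_year_to_last_two, items_published_last_two):
    """Two-year IF: citations received this year to items published in the
    two preceding years, divided by the number of those citable items."""
    return cites_this_year_to_last_two / items_published_last_two

print(h_index([25, 8, 5, 4, 3, 0]))  # 4: four papers with >= 4 citations
print(impact_factor(180, 100))       # 1.8

Even this toy version shows where disagreement creeps in: change what counts as a "citable item" in the denominator and the IF moves, with no change at all in the underlying science.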
To add: