Computer Security Conference Ranking and Statistics
Guofei Gu
Ranking
Note:
- How do you judge how good a conference is? In my opinion, here are several criteria:
  - Acceptance ratio: definitely an important metric (maybe the easiest metric to quantify), but not the only one.
  - Paper quality and impact: How many classic papers come from this conference? How much impact do its papers have on the community? Are they well cited and studied?
  - Committee member quality: What is the quality of the TPC members? Are they noted researchers in this area? This is an important factor because they affect the quality of submissions (good papers tend to go to conferences with noted researchers on the committee) and control the quality of accepted papers.
  - Attendee/paper number ratio: another quantifiable metric, which somewhat reflects the conference's influence on the community.
  - Location: a beautiful place has some attraction. In addition, many researchers (but not all) are reluctant to travel to other countries because of limited funding or time (or visa problems...), so they just submit papers to local conferences. Thus, normally the conferences located in the USA are better than those in Europe, which are in turn better than those in Asia.
  - History: a conference with a long history may have a good tradition and reputation.
  - Industry connection: this somewhat reflects the impact on industry. Normally, conferences on more applied techniques attract more industry partners (and thus have more money to improve the quality of the conference).
- This ranking list reflects only my opinion. It is neither official nor accurate, and is only for reference. Some good workshops are also included.
- I'm probably biased because I'm mainly a network/system security researcher. Notify me if you have different views, or if you find significant conferences missing or misplaced. I'm more than happy to hear from you.
- Some conferences arguably belong in rank 1.5 or 2.5. For now, I'm not going to differentiate these. Let me know if you have strong comments.
- A long time ago, I maintained a general computer science conference ranking (a mirror of my previous Georgia Tech page).
Rank 1
S&P (Oakland)
IEEE Symposium on Security and Privacy
CCS
ACM Conference on Computer and Communications Security
Crypto
International Cryptology Conference
Eurocrypt
European Cryptology Conference
Security
Usenix Security Symposium
NDSS
ISOC Network and Distributed System Security Symposium
Rank 2
ESORICS
European Symposium on Research in Computer Security
RAID
International Symposium on Recent Advances in Intrusion Detection
ACSAC
Annual Computer Security Applications Conference
DSN
The International Conference on Dependable Systems and Networks
CSF (CSFW)
IEEE Computer Security Foundations Symposium
Supersedes CSFW (Computer Security Foundations Workshop)
TCC
Theory of Cryptography Conference
Asiacrypt
International Conference on the Theory and Application of Cryptology and Information Security
IMC
Internet Measurement Conference
Rank 3
SecureComm
IEEE Communications Society/CreateNet International Conference on Security and Privacy for Emerging Areas in Communication Networks
DIMVA
GI SIG SIDAR Conference on Detection of Intrusions and Malware and Vulnerability Assessment
AsiaCCS
ACM Symposium on Information, Computer and Communications Security
ACNS
International Conference on Applied Cryptography and Network Security
FC
International Conference on Financial Cryptography and Data Security
SAC
ACM Symposium on Applied Computing
ACISP
Australasian Conference on Information Security and Privacy
ICICS
International Conference on Information and Communications Security
ISC
Information Security Conference
ICISC
International Conference on Information Security and Cryptology
SACMAT
ACM Symposium on Access Control Models and Technologies
CT-RSA
RSA Conference, Cryptographers' Track
SEC
IFIP International Information Security Conference
WiSec
(WiSe, SASN)
ACM Conference on Wireless Network Security
Supersedes WiSe (ACM Workshop on Wireless Security) and SASN (ACM Workshop on Security of Ad-Hoc and Sensor Networks)
SOUPS
Symposium On Usable Privacy and Security
IFIP WG 11.9
IFIP WG 11.9 International Conference on Digital Forensics
-- Workshops below --
DFRWS
Digital Forensic Research Workshop
FSE
Fast Software Encryption Workshop
PKC
International Workshop on Public-Key Cryptography
NSPW
New Security Paradigms Workshop
IH
Workshop on Information Hiding
WSPEC
Workshop on Security and Privacy in E-commerce
DRM
ACM Workshop on Digital Rights Management
IWIA
IEEE International Information Assurance Workshop
IAW
IEEE SMC Information Assurance Workshop "The West Point Workshop"
CHES
Workshop on Cryptographic Hardware and Embedded Systems
SRUTI
USENIX Workshop on Steps to Reducing Unwanted Traffic on the Internet
HotSec
USENIX Workshop on Hot Topics in Security
LEET
(HotBots,WORM)
USENIX Workshop on Large-scale Exploits and Emergent Threats
Supersedes HotBots (USENIX Workshop on Hot Topics in Understanding Botnets) and WORM (ACM Workshop on Recurring/Rapid Malcode)
Others (not ranked)
ISPEC
Information Security Practice and Experience Conference
Acceptance Ratio Statistics
Note:
- This could be the most complete (and accurate?) list of computer security conference statistics you can find on the Internet. There are still some blanks in this list; if you have a reliable source for this information (or for some other security conferences), please email me (guofei AT cs.tamu.edu). Thanks.
- As you can see from the following statistics, security conferences have become harder and harder to get into over the recent five years, and I think this trend will possibly continue in the near future. Each entry is an acceptance ratio with the accepted/submitted counts in parentheses (see the short sketch below).
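Each cell in the table that follows is simply the accepted/submitted count expressed as a percentage. As a minimal illustration (my own sketch in Python, not part of the original page; the function name and the rounding convention are arbitrary), the arithmetic is:

    # Minimal sketch: an acceptance ratio is accepted/submitted as a percentage.
    def acceptance_ratio(accepted: int, submitted: int) -> float:
        """Acceptance ratio as a percentage, rounded to one decimal place."""
        return round(100.0 * accepted / submitted, 1)

    # A few (accepted, submitted) pairs copied verbatim from the table below:
    for accepted, submitted in [(24, 156), (26, 253), (19, 19)]:
        print(f"{acceptance_ratio(accepted, submitted)}% ({accepted}/{submitted})")
    # Output: 15.4% (24/156), 10.3% (26/253), 100.0% (19/19)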
Columns (left to right): Year; Rank 1: IEEE S&P, ACM CCS, USENIX Security, NDSS; Rank 2: CSF/CSFW, ESORICS, RAID, ACSAC, IMC; Rank 3: SecureComm, DIMVA, AsiaCCS, ACNS, SOUPS, DFRWS. Each row lists the year followed by the ratios available for that year in the column order above; columns with no data are skipped, so entries do not necessarily align one-to-one with the conference list.

2010: 15.4% (24/156)
2009: 10% (26/253) | 18.4% (58/315) | 14.7% (26/176) | 11.7% (20/171) | 19.1% (42/220) | 19.6% (44/224) | 25.3% (19/75, full paper) | 34.7% (26/75) | 29.5% (13/44) | 22.4% (33/147, regular) | 27% (40/147) | 21% (32/150)
2008: 11.2% (28/249) | 18% (51/281) | 15.9% (27/170) | 17.8% (21/118) | 18% (21/115) | 22% (37/168) | 25% (20/80) | 24% (42/173) | 33% (14/42) | 17.6% (32/182, full) | 22.5% (41/182) | 22.9% (30/131) | 28% (12/43) | 39.5% (17/43)
2007: 8% (20/246, full paper) | 12% (29/246) | 18% (55/303) | 12.3% (23/187) | 14.4% (18/125) | 25% (25/101) | 23.8% (39/164) | 18% (18/100) | 22% (42/191) | 26% (31/119) | 24.6% (14/57) | 18.3% (33/180, full paper) | 29.4% (53/180) | 12% (31/260) | 31.7% (13/41) | 47.2% (17/36)
2006: 9.2% (23/251, full paper) | 12.7% (32/251) | 14.8% (38/256) | 12.3% (22/179) | 13.4% (17/127) | 24.5% (25/102) | 20% (32/160) | 17% (16/93) | 30% (40/134) | 21% (19/92, full paper track) | 25.4% (32/126) | 26.8% (11/41) | 17.7% (33/186), Start from 2006 | 15.1% (33/218) | 36% (14/39) | 44.4% (16/36)
2005: 8.9% (17/192) | 15.2% (38/250) | 14.8% (22/149) | 12.9% (16/124) | 20.8% (20/96) | 17.0% (27/159) | 20.5% (17/83) | 22.8% (45/197) | 27% (22/82, full paper) | 22.2% (32/144, full paper) | 36.1% (52/144) | Start from 2005 | 27% (14/51) | 22.2% (35/158) | 10/39
2004: 10.2% (19/186) | 13.9% (35/251) | 12.0% (22/184) | 16.3% (16/98) | ? (20/?) | 17.0% (27/159) | 13.5% (16/118) | 26.1% (35/134) | 19% (19/98, full paper) | 34% (14/41) | Start from 2004 | 12.1% (36/297)
2003: 14.5% (19/131) | 13.8% (35/253) | 16.4% (21/128) | 20.5% (17/83) | ? (17/?) | 16.7% (19/114) | 29.5% (13/44) | 26% (19/73, full paper) | 16.8% (32/191) | Start from 2003
2002: 22.1% (21/95) | 17.6% (27/153) | 16.9% (22/130) | 19.0% (15/79) | 27.4% (23/84) | 19.3% (16/83) | 25.0% (16/64) | 24% (15/62, full paper)
2001: 17.8% (19/107) | 17.6% (27/153) | 28.9% (24/83) | 24.2% (16/66) | 38.9% (21/54) | not held | 21.8% (12/55) | 26% (14/53), Start from 2001
2000: 13.1% (18/137) | 21.2% (28/132) | 29.4% (15/51) | 45.8% (22/48) | 25.3% (19/75) | 53.8% (14/26)
1999: 24.6% (15/61) | 19.3% (16/83) | 40.4% (19/47) | not held | ? (32/?)
1998: 16.4% (19/116) | 20.0% (17/85) | 33.3% (15/45) | 40.4% (23/57) | 67.3% (35/52) | Start from 1998
1997: 18.2% (20/110) | 26.6% (17/64) | not held | not held
1996: 29.9% (20/67) | 32.2% (19/59) | 36.2% (21/58)
1995: 27.8% (20/72) | not held | not held
1994: 29.2% (19/65) | 44.3% (31/70) | not held | Start from 1994? | 36.6% (26/71)
1993: 24.3% (17/70) | 45.0% (27/60) | Start from 1993 | not held
1992: 23.6% (21/89) | ? (24/?)
1991: 30.4% (28/92) | not held
1990: Start from 1990? | ? (?/?) | Start from 1990
1989: (no entries)
1988: Start from 1988
1987: (no entries)
1986: 27.5% (25/91)
1985: 39.7% (25/63) | Start from 1985
1984: 64.1% (25/39)
1983: 67.6% (23/34)
1982: 55.9% (19/34)
1981: (no entries)
1980: 100% (19/19) | Start from 1980
Thanks to the following people for information and suggestions/comments on the ranking and statistics: Mihai Christodorescu, Kevin Almeroth, Jianying Zhou, Zhiqiang Lin, Jonathan Katz, Vinod Yegneswaran, Thomas Zimmermann, Thorsten Holz, Paul A. Karger, Monirul Sharif, Ragib Hasan, Simson Garfinkel, Robin Sommer, Ton van Deursen, ... , and you.
Oakland: 95 points. Its full name is the IEEE Symposium on Security & Privacy, and it is held in Oakland every year. It is not abbreviated as S&P in order to distinguish it from a magazine, IEEE Security & Privacy. It is regarded as the top conference in computer security, more respected than ACM's CCS. The conference claims to accept any paper related to computer security, but my feeling is that it is mainly application-oriented and rather unreceptive to theoretical papers, especially crypto-flavored ones.
CCS: 92 points. The annual conference of ACM SIGSAC. It claims to accept only practical papers, yet it is in fact the most diversified conference in security, with papers ranging from pure cryptography to very applied work. Traditionally the conference has suffered from fairly serious politics, but this year's program committee is very strong and is expected to further improve the quality of the conference.
USENIX Security: 91 points. USENIX is an important organization in systems research and runs several major systems conferences, such as OSDI (the second-ranked operating systems conference). The USENIX Security Symposium is a well-known conference on systems security, and its papers are basically confined to the hardcore-systems type.
NDSS: 90 points. A very good conference on network and distributed system security, leaning toward the applied side.
ESORICS: 88 points. The European annual conference on computer security. Its scope is as broad as CCS's, and it is even more inclusive.
CSFW: 85 points. A small workshop, but one with real influence in security; it is among the few well-respected workshops.
ACSAC: 82 points? A purely applied security conference, so applied that most of its papers can hardly be counted as research papers in the traditional sense, but the problems it studies are very practical and interesting.
SACMAT: 82 points? A fairly important conference on access control.
SecureComm: A network security conference that started only this year. Judging from its program committee, it starts at a high level, but how much impact it will have remains to be seen.
***********************************
CRYPTO (95 points), EUROCRYPT (94 points), ASIACRYPT (90 points): the three annual IACR conferences, which I have already covered in "the few theory conferences I know".
TCC (87 points): a new conference focused on the theory of cryptography. It starts at a high level, but how much impact it will have remains to be seen.
ACNS (84 points?): a conference combining cryptography and network security. It has a short history but a rather low acceptance rate (below fifteen percent).