Why (and How) Did We Assess 150 Security Researchers and 70 Conferences?
Computer security is fun. There is new research every day, a hacking scandal almost every month, and large conferences every year. But there are problematic areas too. For example, everybody knows that DEF CON is a great conference, and speaking there is a joy and an achievement. But there are thousands of security researchers around the world, and DEF CON has limited speaker capacity. That is why there are many different security conferences around the world. You can submit your paper to them. But to which one? If a conference were a movie, you could just open imdb.com and check its score. But there is no such thing for security conferences. So you submit your paper and get accepted. You travel to a different country. On the day of the conference, you find only 5 people in the audience, and the organizer is missing. That is bad; and it gets worse: the conference will be held again next year in exactly the same form.
The same problem applies when choosing a company to work with. You are looking for a company that provides penetration testing services and various products. You see that Acme Security has good reviews on Ga!redacted!, so you start working with them. But within the first week, you realize that Acme Security is simply overrated. So why did Ga!redacted! give them a good review? Because money talks, not the truth.
All of these problems occur for one reason: there is no standardized review/scoring system in the computer security scene. We built pwnhead.com as a solution to these problems.
Scoring the Conferences
We wanted to give a score to every security conference out there and publish a ranking list. To do that, we created a list of conferences that we had attended as a speaker or attendee and scored them based on our experience. For the conferences we didn't attend, we created a few metrics and a formula to determine a conference's score:
Number of attendees
Score of the speakers (we will come to that later)
Proper archiving on the website
As a result, we scored 70 different conferences all around the world. A conference's score is called its "weight", and the maximum weight is 300. The global conference rankings can be seen here: https://pwnhead.com/ranking/conferences/
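To make the idea concrete, here is a minimal sketch of how the three metrics above could be combined into a single weight capped at 300. The actual pwnhead formula is not published, so the per-metric scaling factors and the 2,000-attendee saturation point below are assumptions, not the real implementation.

```python
# Hypothetical sketch of a conference "weight"; every coefficient here is
# an assumption, since the real pwnhead formula is not published.

MAX_WEIGHT = 300  # the stated maximum conference weight

def conference_weight(attendees, avg_speaker_score, has_archive):
    """Combine the three listed metrics into one weight in [0, 300]."""
    # Assumed scaling: attendance contribution saturates at 2,000 people.
    attendance_part = min(attendees / 2000, 1.0) * 120
    # Assumed: speaker scores are normalized to the 0-1 range first.
    speaker_part = avg_speaker_score * 150
    # Assumed: proper archiving on the website is a flat bonus.
    archive_part = 30 if has_archive else 0
    return min(attendance_part + speaker_part + archive_part, MAX_WEIGHT)

print(conference_weight(2500, 0.9, True))  # → 285.0
```

The cap via `min(..., MAX_WEIGHT)` keeps an unusually large conference from exceeding the stated 300-point ceiling.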
Scoring the People
We also wanted to score companies. Some websites do that by ranking companies' profits; some (like Ga!redacted!) do it by taking money from them. Neither is a good way to assess companies, and we weren't interested in financials anyway. We wanted to score companies from a technical point of view, and we believe a company's technical quality can only be measured by its employees' technical skills. Therefore, we had to score people first. Once people were scored, we could score a company from its employees' scores.
To score a person, we had to create a list of metrics and a formula, just as we did for conferences. We settled on the following metrics:
GitHub statistics: if the person has a GitHub account, we check the number of repositories, the number of stars, and so on.
Popularity of written tools: if the person has written security tools, we assess their popularity within the community.
Number of CVEs: CVEs are categorized by importance. For example, a code-execution vulnerability in Chrome has critical importance, while an XSS vulnerability in software that only five people on Earth use has low importance.
Conference presentations: the weight of the conference affects the score.
Number of papers: papers are categorized by importance. For example, a research paper published in a scientific journal scores higher than a conference paper.
Number of books and their popularity.
The person's impact on the security scene: this is a subjective score given by our editors. If a person wrote a tool that became a standard (like nmap) or found a vulnerability with a high impact (like EternalBlue), their impact score will be high.
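The metrics above could be folded into a single researcher score along these lines. This is only a sketch: the real pwnhead formula is not published, so every weight, severity multiplier, and normalization choice below is a labeled assumption.

```python
# Hypothetical researcher score; all coefficients are assumptions, since
# the actual pwnhead formula is not published.

# Assumed severity multipliers, following the critical-vs-low CVE example.
CVE_SEVERITY_POINTS = {"critical": 20, "high": 10, "medium": 5, "low": 1}

def researcher_score(github_stars, tool_popularity, cves,
                     talk_weights, papers, books, editor_impact):
    """cves: list of severity strings; talk_weights: conference weights
    (max 300 each); tool_popularity, papers, books, editor_impact are
    assumed to be pre-normalized to the 0-1 range."""
    score = 0.0
    score += min(github_stars / 1000, 1.0) * 10           # GitHub statistics
    score += tool_popularity * 15                          # tool popularity
    score += sum(CVE_SEVERITY_POINTS.get(s, 0) for s in cves)  # CVEs
    score += sum(w / 300 * 10 for w in talk_weights)       # talks, weighted
    score += papers * 10 + books * 5                       # papers and books
    score += editor_impact * 25                            # subjective impact
    return round(score, 2)
```

Note how a talk's contribution scales with the weight of the conference it was given at, matching the "conference's weight affects your score" rule above.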
We gathered 150 different security researchers from various sources such as conference websites, GitHub, Twitter, LinkedIn, etc. We then analyzed each security researcher one by one. We collected their tools, CVEs, research papers, speaking engagements, and so on, and analyzed each of them. We read their blogs and papers, listened to their presentations, and examined their tools; it took a long, long time. But as a result, we were able to score 150 different security researchers. The global security researcher ranking list can be seen here: https://pwnhead.com/ranking/profile/
Scoring Companies and Countries
As we said before, a company's technical quality can only be measured by its employees' technical skills. The same is true for countries. Therefore, the company/country score is the average score of the top 5 people in that company/country. This number will increase as more people are added to pwnhead.
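The top-5 averaging rule stated above is simple enough to write down directly; only the example scores below are invented for illustration.

```python
# The stated company/country rule: the group score is the average of the
# top 5 researcher scores in that company or country.

def group_score(member_scores, top_n=5):
    """Average of the top_n highest scores; top_n may grow over time."""
    top = sorted(member_scores, reverse=True)[:top_n]
    return sum(top) / len(top) if top else 0.0

print(group_score([90, 80, 70, 60, 50, 10, 5]))  # → 70.0
```

Taking only the top 5 means one strong team can carry a large company's score, while weaker members don't drag it down.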
As a result, pwnhead.com was born as a non-profit website. Its only purpose is to create value for the computer security scene.