The last two years of my PhD were unique in that I worked as an embedded
security researcher within the IT organization at UCSD. I worked side by side
with many of the security practitioners on the team, and had access to a lot of
rich (and noisy) enterprise data. This was a super eye-opening and career-changing experience for me -- in fact it was this experience, along with some other
existential factors, that convinced me that the next right career step for me
wasn't as a research professor, but as an industry researcher.
When I started this role with IT, my directive was to help the IT
security org with whatever issues they were facing. That meant that a lot of the problems I ended up working on[1] were focused on practicality. All the
projects had a very concrete end goal of answering a question that the
organization could then act on to improve a given process.
This role ended up being a dream for me. Not only did I get to work with
some fantastic people, but I also got to use my powers for good and
employ research to answer very practical, pertinent questions that would
have an immediate effect on the organization itself. I had an inside view of an enterprise setting, and I'd get to share our
findings publicly with others[2].
However, the road to sharing our findings wasn't as easy as I thought. While I
am currently writing this on my way to present one of these
projects at ACSAC 2023[3],
the paper had previously been rejected from two other security venues. In both cases, we submitted the paper in a similar form[4], and in both cases we got neutral-to-positive feedback: reviewers enjoyed our measurement and methods and thought the paper was well written, but they didn't see the "novelty" in the idea or why it mattered to the security community.
This feedback was heartbreaking to me[5]. This work had real impact on the organization! It
showed some things worked well, and others didn't![6] It was a large-scale analysis from the
perspective of an enterprise! When I tried to convey this in rebuttals, I would get a very lukewarm rejection: reviewers often admitted that the analysis itself was solid, but worried about its novelty.
As a measurement person, I understand and appreciate that there are different
metrics that can be used to define research. Novelty is one metric for
this[7], but it can also be incredibly vague and an easy fallback when the paper doesn't "feel right for this conference". The novelty argument frustrates me so much that I ended up looking up Merriam-Webster's dictionary definition of research[8]. The first definition is "studious inquiry or
examination" followed by "especially : investigation or experimentation
aimed at the discovery and interpretation of facts, revision of accepted
theories or laws in the light of new facts, or practical application of such
new or revised theories or laws"[9].
So let me ask you this: What could happen if we allowed our definition of
research to shift away from novelty? What if we allowed more practical applications of research to
find their way into the academic realm? What if by doing so, we showed that the
academic security community values collaboration with industry, IT orgs, and
non-profits, thus spurring more on-the-ground research that then finds its way
into the public discourse? What could a world like this look like?
I realize this might be opening Pandora's box. The number of papers that are
submitted to security conferences is already so large, and only increases every
year -- I know because I've been a reviewer at some of these conferences. By
loosening the definition, we open the possibility that paper submissions will increase, potentially making our already task-burdened lives even more burdened.
AND YET. What if the benefits outweighed the costs?
A gal can dream.