=Paper=
{{Paper
|id=Vol-3745/paper24
|storemode=property
|title=Are Disruptive Patents Less Likely to be Granted? Analyzing Scientific Gatekeeping with USPTO Patent Data (2004-2018)
|pdfUrl=https://ceur-ws.org/Vol-3745/paper24.pdf
|volume=Vol-3745
|authors=Lihan Yan,Haochuan Cui,Cheng-Jun Wang
|dblpUrl=https://dblp.org/rec/conf/eeke/YanCW24
}}
==Are Disruptive Patents Less Likely to be Granted? Analyzing Scientific Gatekeeping with USPTO Patent Data (2004-2018)==
Lihan Yan 1,2, Haochuan Cui 3,* and Cheng-Jun Wang 1,2,*

1 Laboratory of Data Intelligence and Interdisciplinary Innovation, Nanjing University, Nanjing, 210023, China
2 School of Journalism and Communication, Nanjing University, Nanjing, 210023, China
3 School of Information Management, Nanjing University, Nanjing, 210023, China

Joint Workshop of the 5th Extraction and Evaluation of Knowledge Entities from Scientific Documents and the 4th AI + Informetrics (EEKE-AII2024), April 23–24, 2024, Changchun, China and Online
* Corresponding author.
Email: 602022110022@smail.nju.edu.cn (L. Yan); hcui94@hotmail.com (H. Cui); wangchj@126.com (C. Wang)
Website: YAN-Lihan.github.io (L. Yan)
ORCID: 0000-0003-3057-1763 (L. Yan); 0000-0001-9686-4265 (H. Cui); 0000-0002-8356-8528 (C. Wang)
© 2024 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).

Abstract

How does scientific gatekeeping in the patent examination system affect disruptive innovation? Although the patent system was established to safeguard innovation, previous research implies that disruptive innovation faces stronger challenges in gaining recognition. To open the black box of scientific gatekeeping, we analyze the dataset of the US Patent and Trademark Office between 2004 and 2018. Findings show that disruptive innovation is detrimental to patent approval, whereas examiner workload and work experience can enhance it. Moreover, examiner workload mitigates the negative impact of disruptive innovation on patent approval, while examiner work experience can amplify the impact of examiner workload on patent approval. This study contributes to the science of science by unveiling the seemingly contradictory gatekeeping logic of patent examiners. The implications help design a more innovation-friendly incentive mechanism for scientific gatekeeping.

Keywords: disruptive innovation, examiner workload, examiner work experience, scientific gatekeeping

1. Introduction

Although the patent examination system is intended to safeguard innovation, it may pose formidable hurdles for disruptive innovations striving for acknowledgment. The system is designed by the government to protect innovative technologies [1], and an important task for patent examiners is to identify innovative patent applications based on prior submissions [1]. Serving as impartial third parties, patent examiners are expected to offer comparatively objective assessments of the quality of patents. However, disruptive innovation faces many challenges in terms of its scientific impact and acceptance. Kuhn posits that innovation is a form of anomaly, and that truly understanding such groundbreaking works, which challenge established paradigms, often demands a substantial amount of time [2]. Prior research shows that disruptive innovation is risky and hard to pay off [3, 4, 5].
Noh and Lee, in their analysis of patents within the telecommunications field, suggest that disruptive innovations often struggle to capture the attention of examiners due to their significant deviation from existing technologies [6]. Thus, we formulate the key puzzlement of this study: does scientific gatekeeping within the patent examination system promote or suppress disruptive innovation?

We draw on theories of scientific gatekeeping, analyzing 4.5 million patents (2006–2013) from the United States Patent and Trademark Office's (USPTO) dataset and building a citation network from it with network analysis methods. We define disruptive innovation as a leap or break with the traditional knowledge structure [5], and quantify disruptive innovation by the CD index five years after the publication year of each patent [7]. To explore the bias in the patent approval process, we focus on two key characteristics of patent examiners, namely workload and work experience. Then, we use mixed effect models and propensity score weighting (PSW) to construct regression models and test the hypotheses.

We claim that disruptive innovation has a negative impact on patent approval, and that examiner workload can reduce the impact of disruptive innovation on patent approval. Examiner workload and examiner work experience both have a positive impact on patent approval, and examiner work experience can amplify the effect of examiner workload on patent approval. Additionally, a granted patent means more patent citations, which helps knowledge flow and technology spillover. This study contributes to the science of science by unveiling the seemingly contradictory gatekeeping logic of patent examiners towards disruptive innovations.

2. Literature Review

2.1. Disruptive Innovation and Patent Approval

Disruptive innovation indicates a leap or a break with the traditional knowledge structure [5], which is quite essential in the progress of science. However, normal science tends to explain existing problems and expand based on traditional knowledge rather than breaking out of the existing knowledge framework for innovation [2]. The same thing happens with patents, even though patents are used by the government to protect innovation. A patent that introduces a groundbreaking and disruptive innovative idea may struggle to attract attention because it is significantly different from existing technologies [6]. Moreover, some patents with a high degree of disruptive innovation may be accompanied by technical boundary spanning, which requires the examiner to do more back-and-forth work with the patent office, increasing the difficulty of examination and adversely affecting the granting result [8]. Therefore, we propose the following hypothesis:

H1: Disruptive innovation has a negative effect on patent approval.
2.2. Patent Examiner and Patent Approval

With increasing workload, patent examiners are required to review a greater number of patent applications within a fixed timeframe, which affects whether patents are granted and their quality. Rejecting a patent takes more time than accepting one [9, 10]. If examiners do not have sufficient time to thoroughly review all relevant prior art for each application to determine whether it meets the novelty requirement, then granting patents to applications that should have been rejected becomes more likely [11, 12]. Moreover, the experience of examiners inevitably varies significantly at a specific point in time or for a particular group of patents, influencing the quality and outcome of patents granted [13]. An increase in an examiner's work experience makes them more inclined to grant a patent. Mann suggests that an increase in work experience may instigate a "burnout" effect and result in an escalated workload, which links to a higher rate of patents granted [14]. Therefore, we propose the following hypothesis:

H2: Examiner workload (a) and examiner work experience (b) have a positive effect on patent approval.

As the experience and workload of an examiner increase, they are more inclined to grant patents [15], which may consequently result in a relatively higher approval rate for patents involving disruptive innovation. If an experienced examiner conducts the review, their relatively reduced focus on existing technology [15] might lead to a more lenient assessment of patents involving disruptive innovation. Additionally, patents featuring disruptive innovation often involve interdisciplinary aspects, which might not entirely conform to the anticipated knowledge framework. This implies that reviewing patents involving disruptive innovation is relatively less challenging for these experienced examiners. Moreover, rejecting disruptive patents requires finding specific reasons, such as a significant gap from the current technology [6], which takes more time; the time constraints caused by a heavy workload make it relatively difficult for examiners to do so. Therefore, we propose the following hypothesis:

H3: Examiner work experience (a) and examiner workload (b) can reduce the negative impact of disruptive innovation on patent approval.

The accumulation of work experience enables examiners to gradually form personalized work routines, which diminishes their susceptibility to workload. Accumulated work experience enables patent examiners to conduct examinations with greater efficacy and efficiency, empowering them to better manage time constraints [19]. On the contrary, less experienced examiners are more prone to relying heavily on prior patents in their examination process [15], which amplifies the positive effect of workload on grant approval. In all, examiners' work experience mitigates the impact of their workload on patent approval. Thus, we propose the following hypothesis:

H4: Examiner work experience can mitigate the positive effect of examiner workload on patent approval.

3. Method

3.1. Data

We use the USPTO Patent dataset to obtain the basic information about patents (2004-2018). In order to calculate the work experience of examiners and CD5 accurately, we analyze 200 thousand patents from 2006 to 2013 after data merging and cleaning.

3.2. Measures

3.2.1. Dependent variables

Patent Approval. Patent Approval is a dummy variable that indicates whether the given patent is granted or not. This variable takes the value 1 if the patent is granted and 0 if it is rejected.
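To give a concrete picture of the data preparation described above, the short sketch below shows one way such an analysis table could be assembled and the approval outcome coded with pandas. The file name, column names, and decision labels (e.g., applications.csv, examiner_id, decision == "granted") are hypothetical placeholders and do not describe the authors' actual pipeline.

```python
import pandas as pd

# Hypothetical stand-in for the merged USPTO application data (2004-2018);
# file name, column names, and decision labels are illustrative only.
apps = pd.read_csv("applications.csv", parse_dates=["filing_date", "decision_date"])

# Dependent variable (Section 3.2.1): 1 if the application was granted, 0 if rejected.
apps["patent_approval"] = (apps["decision"] == "granted").astype(int)

# Examiner work experience: years between an examiner's first appearance in the data
# and the decision on the focal application.
first_year = apps.groupby("examiner_id")["filing_date"].transform("min").dt.year
apps["work_experience"] = apps["decision_date"].dt.year - first_year

# Exclude examiners already present in the first two years of the dataset,
# whose true tenure is left-censored (cf. Section 3.2.2).
apps = apps[first_year > apps["filing_date"].dt.year.min() + 1]

# Restrict the analysis window to 2006-2013 filings, as in Section 3.1.
df = apps[apps["filing_date"].dt.year.between(2006, 2013)]
```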
3.2.2. Independent variables

Disruptive Innovation. Following the tradition of prior research [17, 18], we calculate the D-score of disruption for each patent as follows:

D = (n_i − n_j) / (n_i + n_j + n_k),   (1)

where n_i is the number of subsequent patents that cite only the focal patent, n_j is the number of subsequent patents that cite both the focal patent and its references, and n_k is the number of subsequent patents that cite only the focal patent's references. However, the measure of disruption D tends to be underestimated in the first few years [5]. Therefore, we calculate disruptive innovation based on citations of the focal patent over a 5-year time window (CD5). Because the distribution of disruption is also highly skewed, we use the CD5 percentile (M = 0.59, SD = 0.35) to measure the disruptive innovation of the patent.
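To make Eq. (1) concrete, the following minimal sketch computes the CD index for a single focal patent from a citation network restricted to a 5-year forward window. It assumes a simple in-memory representation (cites maps each patent to the set of patents it cites, year maps each patent to its publication year) and illustrates the definition rather than reproducing the authors' implementation.

```python
from typing import Dict, Hashable, Set

def cd_index(focal: Hashable,
             cites: Dict[Hashable, Set[Hashable]],
             year: Dict[Hashable, int],
             window: int = 5) -> float:
    """CD index of `focal` from forward citations within `window` years (CD5 by default).

    cites[p] is the set of patents cited by p; year[p] is p's publication year.
    Returns (n_i - n_j) / (n_i + n_j + n_k) as in Eq. (1), or 0.0 if there are no citers.
    """
    refs = cites.get(focal, set())            # backward citations of the focal patent
    horizon = year[focal] + window
    n_i = n_j = n_k = 0
    for p, cited in cites.items():
        if p == focal or year.get(p, horizon + 1) > horizon:
            continue                           # outside the 5-year forward window
        hits_focal = focal in cited
        hits_refs = bool(cited & refs)
        if hits_focal and not hits_refs:
            n_i += 1                           # cites only the focal patent
        elif hits_focal and hits_refs:
            n_j += 1                           # cites the focal patent and its references
        elif hits_refs:
            n_k += 1                           # cites only the focal patent's references
    total = n_i + n_j + n_k
    return (n_i - n_j) / total if total else 0.0

# Tiny worked example: two citers hit only the focal patent, one hits both -> D = (2 - 1) / 3.
cites = {"F": {"R1", "R2"}, "A": {"F"}, "B": {"F"}, "C": {"F", "R1"}}
year = {"F": 2006, "R1": 2000, "R2": 2001, "A": 2008, "B": 2009, "C": 2010}
print(cd_index("F", cites, year))  # 0.333...
```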
Examiner Workload. Examiner workload refers to how much of the burden of other patents is assigned to the examiner when they evaluate the focal patent. Following the work of Funk and Owen-Smith [17], we weight the patents assigned in the period between the filing date of the focal patent and the date of its grant or rejection to make the calculation more accurate.

Examiner Work Experience. Examiner work experience is the number of years the examiner has worked for the USPTO. To measure it more accurately, we exclude examiners who appear in the first 2 years of the dataset (M = 3.09, SD = 1.82).

4. Findings

The key puzzlement of this research concerns the relationship between Disruptive Innovation, Patent Approval, and Patent Examiners. To begin, we report the correlation matrix of the key variables in Table 1.

Table 1: Correlation Matrix of Key Variables

| | Disruptive Innovation | Patent Approval | Patent Citations | Examiner Workload |
| Patent Approval | -0.038*** | | | |
| Patent Citations | -0.102*** | 0.035*** | | |
| Examiner Workload | -0.057*** | 0.229*** | 0.040*** | |
| Examiner Work Experience | -0.049*** | 0.042*** | -0.080*** | 0.205*** |

Note: * p < 0.1; ** p < 0.05; *** p < 0.001

We use mixed effect models to test research hypotheses 1-4 (see Table 2), which concern the relationship between disruptive innovation, examiner work experience, examiner workload, and patent approval. As Table 2 shows, the results indicate a negative impact of disruptive innovation on patent approval; that is, the higher the disruptive potential of a patent, the greater the difficulty in obtaining a grant. Therefore, H1 is well supported.

Table 2: Mixed Effect Model and Interaction Effect on Patent Approval

| | Model 1 | Model 2 | Model 3 | Model 4 | Model 5 |
| Disruptive Innovation | -0.400*** | | | -0.372*** | -1.828*** |
| Examiner Workload | | 1.150*** | | 1.527*** | 1.232*** |
| Examiner Work Experience | | | 0.090*** | 0.083*** | -0.068*** |
| Disruptive Innovation × Examiner Workload | | | | | 0.322*** |
| Disruptive Innovation × Examiner Work Experience | | | | | -0.014 |
| Examiner Workload × Examiner Work Experience | | | | | 0.034*** |
| Control variables | Yes | Yes | Yes | Yes | Yes |
| Team Size | Yes | Yes | Yes | Yes | Yes |
| References | Yes | Yes | Yes | Yes | Yes |
| Number of Labels | Yes | Yes | Yes | Yes | Yes |
| IPCR Labels | Yes | Yes | Yes | Yes | Yes |
| Year | Yes | Yes | No | No | No |
| Country | Yes | Yes | Yes | Yes | Yes |
| Random effect: Examiner ID | Yes | Yes | Yes | Yes | Yes |
| Constant | 0.188 | -5.958*** | -0.316*** | -6.619*** | -5.288*** |
| Log Likelihood | -534,028.7 | -518,339.5 | -125,823.1 | -119,516.8 | -119,452.8 |
| Akaike Inf. Crit. | 1,068,117.0 | 1,036,739.0 | 251,692.2 | 239,083.7 | 238,961.7 |
| Bayesian Inf. Crit. | 1,068,471.0 | 1,037,092.0 | 251,927.6 | 239,339.6 | 239,248.2 |

Note: * p < 0.1; ** p < 0.05; *** p < 0.001

According to the results of Models 2-4 in Table 2, both examiner work experience and examiner workload have a positive impact on patent approval. In other words, the longer the tenure of examiners and the greater their workload, the more likely patents are to be accepted. Therefore, H2(a) and H2(b) are well supported.

As Model 5 in Table 2 shows, first, the moderation effect of Examiner Work Experience on the relationship between Disruptive Innovation and Patent Approval is not significant; thus H3(a) is rejected. Second, Examiner Workload moderates the relationship between Disruptive Innovation and Patent Approval, reducing the negative impact of Disruptive Innovation on Patent Approval (as shown in Figure 1). Furthermore, simple slope analysis reveals that when workload is at -1 SD, the mean, and +1 SD, the slopes of Disruptive Innovation are -0.40 (t = -23.78, p < 0.001), -0.22 (t = -23.78, p < 0.001), and -0.04 (t = -23.78, p = 0.16), respectively. This means that for examiners with heavier workloads, the probability of rejecting a disruptive patent is relatively smaller. Therefore, H3(b) is supported. Third, Examiner Work Experience moderates the effect of Examiner Workload on Patent Approval. Simple slope analysis reveals that when work experience is at -1 SD, the mean, and +1 SD, the slopes of Examiner Workload are 0.86 (t = 71.48, p < 0.001), 1.03 (t = 105.06, p < 0.001), and 1.20 (t = 82.84, p < 0.001), respectively. When the examiner workload is less than approximately 4.5, higher examiner work experience is associated with a lower probability of patent approval at the same workload level (see Figure 2). Therefore, H4 is only partially supported.

Figure 1: The Moderation Effect of Examiner Workload on Patent Approval (predicted Patent Approval plotted against Disruptive Innovation at -1 SD, the mean, and +1 SD of Examiner Workload).

Figure 2: The Moderation Effect of Examiner Work Experience on Patent Approval (predicted Patent Approval plotted against Examiner Workload at -1 SD, the mean, and +1 SD of Examiner Work Experience).
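As a rough illustration of the specification summarized in Table 2, the sketch below fits a logistic mixed effect model with a random intercept per examiner and the Model 5 interaction terms, using statsmodels' Bayesian mixed GLM. The variable names refer to the hypothetical analysis table df from the earlier sketch (assuming it also contains disruptive_innovation, the CD5 percentile, and examiner_workload columns built as in Section 3.2.2); it is a simplified approximation that omits the control variables and the propensity score weighting, not the authors' code.

```python
import statsmodels.api as sm

# Approval is binary, so a logistic mixed model with an examiner random intercept
# approximates the "Random effect: Examiner ID" specification of Table 2, Model 5.
model = sm.BinomialBayesMixedGLM.from_formula(
    "patent_approval ~ disruptive_innovation * examiner_workload"
    " + disruptive_innovation * work_experience"
    " + examiner_workload * work_experience",
    vc_formulas={"examiner": "0 + C(examiner_id)"},  # random intercept per examiner
    data=df,
)
result = model.fit_vb()  # variational Bayes fit; fit_map() is an alternative
print(result.summary())
```

Under this specification, the simple slope of disruptive innovation at a given workload level w equals the disruptive innovation coefficient plus w times the Disruptive Innovation × Examiner Workload coefficient, which is the quantity examined in the simple slope analysis above.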
5. Conclusion

In summary, this study aims to elucidate the relationship between disruptive innovation, patent examiners, and granted patents, investigating factors that influence patent approval, including disruptive innovation, examiner workload, and examiner experience, while also exploring the impact of granted patents on citations. This study has significant theoretical and policy implications. First, we provide additional evidence from the gatekeeping perspective that disruptive innovation faces difficulties in gaining acceptance in the scientific field [2]. Second, we explore the bias of examiners from the aspects of workload and work experience, thereby shedding light on the black box of the gatekeeping process by contrasting granted and rejected patents. Third, the mechanisms by which innovation is either fostered or hindered during the gatekeeping process help us better understand and enhance the existing patent examination system's ability to safeguard innovation.

We acknowledge the limitations of this study, which provide some insights and directions for future research. First, we lack examination opinion data that detail the reasons for patent rejections. If specific examination opinions were available, it would be possible to explore more precise gatekeeping mechanisms. Second, when measuring the impact of a patent, we have only considered patent citations and have overlooked the influence of academic papers. Third, demographic factors of patent examiners (e.g., gender and age), which could influence their decision-making processes and potential biases, are not included in the analysis.

References

[1] Meyer, M. (2000). What is special about patent citations? Differences between scientific and patent citations. Scientometrics, 49(1), 93–123. https://doi.org/10.1023/A:1005613325648
[2] Kuhn, T. (1962). The Structure of Scientific Revolutions (1st ed.). Chicago: University of Chicago Press.
[3] Foster, J. G., Rzhetsky, A., & Evans, J. A. (2015). Tradition and Innovation in Scientists' Research Strategies. American Sociological Review, 80(5), 875–908. https://doi.org/10.1177/0003122415601618
[4] Uzzi, B., Mukherjee, S., Stringer, M., & Jones, B. (2013). Atypical Combinations and Scientific Impact. Science, 342(6157), 468–472. https://doi.org/10.1126/science.1240474
[5] Lin, Y., Evans, J. A., & Wu, L. (2022). New directions in science emerge from disconnection and discord. Journal of Informetrics, 16(1), 101234. https://doi.org/10.1016/j.joi.2021.101234
[6] Noh, H., & Lee, S. (2020). What constitutes a promising technology in the era of open innovation? An investigation of patent potential from multiple perspectives. Technological Forecasting and Social Change, 157, 120046. https://doi.org/10.1016/j.techfore.2020.120046
[7] Park, M., Leahey, E., & Funk, R. J. (2023). Papers and patents are becoming less disruptive over time. Nature, 613(7942), 138–144. https://doi.org/10.1038/s41586-022-05543-x
[8] Whalen, R. (2018). Boundary spanning innovation and the patent system: Interdisciplinary challenges for a specialized examination system. Research Policy, 47(7), 1334–1343. https://doi.org/10.1016/j.respol.2018.04.017
[9] Langinier, C., & Marcoul, P. (2020). Monetary and implicit incentives of patent examiners. Journal of Economics and Business, 110, 105906. https://doi.org/10.1016/j.jeconbus.2020.105906
[10] Schuett, F. (2013). Patent quality and incentives at the patent office. The RAND Journal of Economics, 44(2), 313–336. https://doi.org/10.1111/1756-2171.12021
[11] Caillaud, B., & Duchêne, A. (2011). Patent office in innovation policy: Nobody's perfect. International Journal of Industrial Organization, 29(2), 242–252. https://doi.org/10.1016/j.ijindorg.2010.06.002
[12] Frakes, M. D., & Wasserman, M. F. (2014). Is the Time Allocated to Review Patent Applications Inducing Examiners to Grant Invalid Patents? Evidence from Micro-Level Application Data (NBER Working Paper 20337). National Bureau of Economic Research. https://doi.org/10.3386/w20337
[13] Cockburn, I., Kortum, S., & Stern, S. (2002). Are All Patent Examiners Equal? The Impact of Examiner Characteristics (NBER Working Paper w8980). National Bureau of Economic Research. https://doi.org/10.3386/w8980
[14] Mann, R. J. (2013). The Idiosyncrasy of Patent Examiners: Effects of Experience and Attrition. Texas Law Review, 92, 2149.
[15] Lemley, M. A., & Sampat, B. (2012). Examiner Characteristics and Patent Office Outcomes. The Review of Economics and Statistics, 94(3), 817–827. https://doi.org/10.1162/REST_a_00194
[16] Kim, Y. K., & Oh, J. B. (2017). Examination workloads, grant decision bias and examination quality of patent office. Research Policy, 46(5), 1005–1019. https://doi.org/10.1016/j.respol.2017.03.007
[17] Funk, R. J., & Owen-Smith, J. (2017). A Dynamic Network Measure of Technological Change. Management Science, 63(3), 791–817. https://doi.org/10.1287/mnsc.2015.2366
[18] Wu, L., Wang, D., & Evans, J. A. (2019). Large teams develop and small teams disrupt science and technology. Nature, 566(7744), 378–382. https://doi.org/10.1038/s41586-019-0941-9
[19] Shu, T., Tian, X., & Zhan, X. (2022). Patent quality, firm value, and investor underreaction: Evidence from patent examiner busyness. Journal of Financial Economics, 143(3), 1043–1069. https://doi.org/10.1016/j.jfineco.2021.10.013