Transformer-Based Patent Novelty Search by Training Claims to Their Own Description

Michael Freunek, André Bodmer


In this paper we present a method that concatenates patent claims with their own descriptions. By applying this method, Bidirectional Encoder Representations from Transformers (BERT) is trained to match claims to suitable descriptions. Such a trained BERT model could be able to identify descriptions relevant to the novelty of patents. In addition, we introduce a new scoring scheme, the relevance score (or novelty score), to interpret the output of BERT. We test the method on patent applications by training BERT on the first claims of patents and the corresponding descriptions. The output is processed according to the relevance score, and the results are compared with the X documents cited in the search reports. The test shows that BERT scores some of the cited X documents as highly relevant.
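The training setup described above can be sketched as a pairing task: each first claim is paired with its own description as a positive example and with another patent's description as a negative one, and the model's two classifier logits are turned into a relevance score via a softmax. The sketch below illustrates this idea in plain Python; the patent record schema, the pair-construction details, and the score definition are assumptions for illustration, not the authors' exact pipeline.

```python
import math
import random

def make_training_pairs(patents, seed=0):
    """Build positive (claim, own description) and negative
    (claim, other patent's description) pairs, in the style of BERT's
    next-sentence-prediction pretraining. `patents` is a list of
    {"claim": str, "description": str} dicts (hypothetical schema)."""
    rng = random.Random(seed)
    pairs = []
    for i, p in enumerate(patents):
        # Positive pair: claim with its own description, label 1.
        pairs.append((p["claim"], p["description"], 1))
        # Negative pair: claim with a randomly chosen other description, label 0.
        j = rng.randrange(len(patents) - 1)
        if j >= i:
            j += 1
        pairs.append((p["claim"], patents[j]["description"], 0))
    return pairs

def relevance_score(logits):
    """Softmax over the two classifier logits; the probability of the
    'matching description' class is read as the relevance (novelty) score."""
    exps = [math.exp(x) for x in logits]
    return exps[0] / sum(exps)
```

At search time, a candidate prior-art description would be paired with the claim under examination and fed to the trained model; a relevance score near 1 flags the description as potentially novelty-destroying, which is how the cited X documents would surface.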
