BookNLP: a natural language processing pipeline for books
The larger and more accurate big model is suited to GPUs and multi-core machines; the faster small model is more appropriate for personal computers. See the table below for a comparison of the two, both in overall speed and in accuracy on the tasks that BookNLP performs. To explore running BookNLP on a GPU in Google Colab, see this notebook.

If using a GPU, install pytorch for your system and CUDA version by following the installation instructions at https://pytorch.org.

This runs the full BookNLP pipeline; you can run only some elements of it (to cut down on computation time) by listing them in the pipeline parameter. For example, to run only entity tagging and event tagging, change model_params above to include "pipeline":"entity,event".
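As a sketch of how such a restricted pipeline might be configured, the snippet below builds a model_params dictionary and wraps the BookNLP call in a function. The import path, file names, and output directory are illustrative assumptions; consult the repository's README for the exact API of your installed version.

```python
# Hypothetical usage sketch for BookNLP, assuming the package is installed
# (pip install booknlp) and its models have been downloaded.

# Pipeline components are given as a comma-separated string; restricting the
# list (e.g. "entity,event") cuts down on computation time.
model_params = {
    "pipeline": "entity,event",  # run only entity tagging and event tagging
    "model": "small",            # "big" for GPUs/multi-core; "small" for laptops
}


def run_booknlp(input_file: str, output_dir: str, book_id: str) -> None:
    """Process one book with the configured pipeline (illustrative names)."""
    # Import inside the function so this sketch parses without booknlp installed.
    from booknlp.booknlp import BookNLP

    nlp = BookNLP("en", model_params)
    nlp.process(input_file, output_dir, book_id)


# Example call (paths are placeholders):
# run_booknlp("novel.txt", "output/", "novel")
```

Keeping the parameters in a single dictionary makes it easy to switch between the big and small models, or to re-enable the full pipeline, without touching the processing code.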