My thoughts on promoting further AI development
Please bear with me, point out anything I get wrong, and I'd be glad to hear your thoughts. Over the past five years, a series of Transformer-based models has been created and much related work has been done. Pre-trained large language models with few-shot prompting have become the new paradigm for tackling a broad range of NLP tasks. This is remarkable and genuinely useful for NLP applications. But no significant improvement has been made on the model-architecture (algorithm) side; everything is still Transformer-based.
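To make the "few-shot prompting" paradigm mentioned above concrete, here is a minimal sketch of how such a prompt is typically assembled: a handful of labeled examples are prepended to the query so the model can infer the task in-context. The task, examples, and formatting below are purely illustrative assumptions, not tied to any particular model or library.

```python
def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt from (input, label) example pairs.

    Each example is rendered in a fixed template; the final query is
    rendered in the same template with the label left blank, so the
    model completes it by analogy with the preceding examples.
    """
    lines = []
    for text, label in examples:
        lines.append(f"Review: {text}\nSentiment: {label}")
    # The query reuses the template but stops at "Sentiment:",
    # leaving the label for the model to fill in.
    lines.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(lines)

# Hypothetical sentiment-classification examples for illustration.
examples = [
    ("The plot was gripping from start to finish.", "positive"),
    ("I walked out halfway through.", "negative"),
]
prompt = build_few_shot_prompt(examples, "A beautifully shot, moving film.")
print(prompt)
```

The key point is that no weights are updated: the pre-trained model adapts to the task purely from the examples embedded in the prompt, which is what makes this paradigm applicable to such a broad range of NLP tasks.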
Nov-27-2022, 03:46:09 GMT
- Technology: