Definition | Bidirectional Encoder Representations from Transformers
Category | Computing » Programming & Development
Country/Region | Worldwide
Type | Acronym
Bidirectional Encoder Representations from Transformers (BERT) is a Machine Learning (ML) model for Natural Language Processing (NLP) developed by Google. NLP is the field of Artificial Intelligence (AI) that aims to enable computers to read, analyze, interpret, and derive meaning from text and spoken words.
BERT is based on the Transformer, a deep learning architecture in which every output element is connected to every input element, and the weightings between them are dynamically calculated based on their connection (a mechanism known as attention). Transformers were first introduced by Google in 2017.
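To make the idea of dynamically calculated weightings concrete, here is a minimal sketch of scaled dot-product attention, the core operation inside a Transformer. This is an illustrative NumPy toy with randomly initialized projection matrices, not BERT's actual implementation: for each position in the input, it computes a weight over every other position, so every output element depends on every input element.

```python
import numpy as np

def scaled_dot_product_attention(X, Wq, Wk, Wv):
    # Project the inputs into queries, keys, and values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Every output attends to every input: scores has shape (seq_len, seq_len).
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax turns the scores into dynamically computed weights
    # that sum to 1 across all input positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted combination of every value vector.
    return weights @ V, weights

# Toy example: 4 token embeddings of dimension 8 (random stand-ins).
rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
X = rng.normal(size=(seq_len, d_model))
Wq = rng.normal(size=(d_model, d_model))
Wk = rng.normal(size=(d_model, d_model))
Wv = rng.normal(size=(d_model, d_model))

out, weights = scaled_dot_product_attention(X, Wq, Wk, Wv)
print(out.shape)      # (4, 8)  -- one output vector per input position
print(weights.shape)  # (4, 4)  -- each position weights all positions
```

Because the attention weights span all positions at once, the mechanism is inherently bidirectional: position 2 can draw on positions before and after it, which is the property BERT's name highlights.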
The full form of BERT is Bidirectional Encoder Representations from Transformers.