BERT Base

By Amazon Web Services (AWS)

4.1 out of 5 stars

BERT Base Reviews & Product Details



BERT Base Reviews (5)

G2 reviews are authentic and verified.
Rishika J.
Software Engineer II
Mid-Market (51-1000 emp.)
"BERT: A question-answering model in PyTorch"
What do you like best about BERT Base?

One of the best parts of this PyTorch transformer is its support for more than 100 languages. BERT combines efficient neural network architectures, training objectives, and transfer learning. It is a pre-trained model that can be fine-tuned to high accuracy on datasets such as SQuAD. It answers questions concisely and supports other use cases, such as highlighting the passages of a paragraph that are most relevant to the question asked. Review collected by and hosted on G2.com.

What do you dislike about BERT Base?

The accuracy and the broad support for large datasets across many languages make BERT Base Multilingual Uncased (PyTorch Hub, extractive question answering) an expensive model. Because of the large datasets, it is slow to train: there are many weights to update, which increases computation time.
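The extractive question answering described above comes down to the model scoring every token of the context as a possible answer start and end, then picking the (start, end) pair with the highest combined score. A minimal sketch of that span-selection step, using made-up logits rather than a real BERT forward pass:

```python
# Span selection for extractive QA: given per-token start and end
# scores, pick the (start, end) pair with the highest combined score.
# The logits below are hypothetical; a real model (e.g. BERT fine-tuned
# on SQuAD) would produce them from the question + context tokens.

def best_span(start_logits, end_logits, max_answer_len=15):
    best = (0, 0)
    best_score = float("-inf")
    for s, s_score in enumerate(start_logits):
        # Only consider ends at or after the start, within a length cap.
        for e in range(s, min(s + max_answer_len, len(end_logits))):
            score = s_score + end_logits[e]
            if score > best_score:
                best_score = score
                best = (s, e)
    return best

tokens = ["bert", "was", "released", "by", "google", "in", "2018"]
start_logits = [0.1, 0.0, 0.2, 0.0, 0.3, 0.1, 5.0]
end_logits = [0.0, 0.1, 0.0, 0.2, 0.1, 0.0, 4.8]

s, e = best_span(start_logits, end_logits)
print(" ".join(tokens[s : e + 1]))  # -> 2018
```

In a real pipeline the selected token span is then mapped back to the original character offsets of the context so the answer can be highlighted, which is what makes the paragraph-highlighting use case possible.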

Jagadis P.
Product Specialist (Order to Cash)
Enterprise (> 1000 emp.)
"Mastering your setup with PyTorch - Masterpiece"
What do you like best about BERT Base?

PyTorch BERT is one of the best-known extractive question-answering tools, built on the idea of text embeddings. It takes a question string and a context string as input and returns the substring of the context that best matches the actual answer to the question. The best part of this setup is that it is pre-trained on a multilingual corpus, which helps it handle question-context pairs across languages.

What do you dislike about BERT Base?

AI and ML are doing marvelous things, but we still have not reached the level we want. Sometimes the model acts oddly, returning an answer that relates to the question lexically but not contextually. This can be set aside as an exception, because such instances are rare and usually occur when the inputs are not phrased properly.

Tarang N.
Systems Associate - Trainee
Mid-Market (51-1000 emp.)
"BERT: An uncased multilingual base model"
What do you like best about BERT Base?

BERT Base Multilingual Uncased on PyTorch Hub is a transformer model that helps the computer understand multilingual data by normalizing text from different languages into one uncased form. With the help of artificial intelligence it can predict the next sentence, and it is trained by randomly masking some of the words and running the model to fill in the whole sentence.

What do you dislike about BERT Base?

There isn't anything I dislike about BERT Base Multilingual, but it is primarily intended to be fine-tuned on tasks that use the whole sentence to make a decision, such as sequence classification.
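The training procedure the review alludes to is masked language modelling: a fraction of the input tokens (about 15% for BERT) is replaced by a [MASK] token, and the model is trained to recover the originals. A minimal sketch of the masking step only; the prediction itself needs a trained model, and the mask rate and tokenization here are simplified assumptions:

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Replace roughly mask_prob of the tokens with [MASK], returning
    the masked sequence and a map of masked positions to the original
    tokens the model would be trained to recover."""
    rng = random.Random(seed)  # seeded for reproducibility
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            masked.append(MASK)
            targets[i] = tok  # training target at this position
        else:
            masked.append(tok)
    return masked, targets

tokens = "the model predicts the missing words in a sentence".split()
masked, targets = mask_tokens(tokens, seed=1)
print(masked)
print(targets)
```

Real BERT pre-training also sometimes keeps the original token or substitutes a random one instead of [MASK], and operates on WordPiece sub-tokens rather than whole words; both refinements are omitted here for brevity.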

Verified User in Information Technology and Services
Small-Business (50 or fewer emp.)
"Natural language processing Model"
What do you like best about BERT Base?

BERT is a multilingual base model trained on 102 languages. An advantage of the model is that it is uncased. It can be accessed easily through the PyTorch library. The model is intended to be fine-tuned on tasks that depend on whole sentences.

What do you dislike about BERT Base?

The model seems to be quite efficient and effective. I didn't find any drawbacks.
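"Uncased" means the tokenizer lowercases text and strips accent marks before tokenizing, so "Héllo", "héllo", and "HELLO" all map to the same form, which is convenient across 102 languages with differing casing conventions. A rough sketch of that normalization step; the real tokenizer also performs WordPiece splitting, which is omitted here:

```python
import unicodedata

def uncased_normalize(text):
    """Lowercase and strip combining accent marks, roughly mirroring
    what an 'uncased' BERT tokenizer does before WordPiece splitting."""
    text = text.lower()
    # Decompose characters (e.g. 'e-acute' -> 'e' + combining accent)...
    decomposed = unicodedata.normalize("NFD", text)
    # ...then drop the combining marks (Unicode category 'Mn').
    return "".join(c for c in decomposed if unicodedata.category(c) != "Mn")

print(uncased_normalize("Héllo"))    # -> hello
print(uncased_normalize("MÜNCHEN"))  # -> munchen
```

The trade-off is that case and accent information is lost, which matters for tasks like named-entity recognition; that is one reason cased variants of the model also exist.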

Verified User in Information Technology and Services
Enterprise (> 1000 emp.)
"BERT BASE - Works Perfectly well"
What do you like best about BERT Base?

The language model tokenizer. It works well with all kinds of data and across generic industries.

What do you dislike about BERT Base?

It is difficult to accomplish tasks in a limited time; the model is time-consuming.

Pricing

Pricing details for this product aren't currently available. Visit the vendor's website to learn more.
