Hi all,

Sorry in advance for my naivety. I've been following this guide to fine-tune a 
BERT model for the MRPC dataset: 
[https://nlp.gluon.ai/examples/sentence_embedding/bert.html#Data-preprocessing-for-BERT](https://nlp.gluon.ai/examples/sentence_embedding/bert.html#Data-preprocessing-for-BERT)

The training and other steps work flawlessly, with a decent level of accuracy.

However, I can't seem to find any documentation on how to use my exported model 
for inference.

I'd like to do something like what's outlined below (obviously pseudocode):

    from bert.deploy import infer 

    model = "/path/to/model"

    sentence1 = "The apple fell from the tree"
    sentence2 = "The pear fell from the tree"

    result = infer(model, sentence1, sentence2)

    print(result) # where 0 = not a paraphrase and 1 = is a paraphrase
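For what it's worth, the closest I've been able to piece together from the MXNet/GluonNLP docs is the rough sketch below, loading the exported files with SymbolBlock.imports. The file names (model-symbol.json / model-0000.params) and the graph input names (data0/data1/data2 for token ids, segment ids and valid length) are just my guesses, so I'm not sure this is the intended route:

    # rough sketch only -- the exported file names and the input names
    # ('data0'/'data1'/'data2') below are guesses on my part
    import mxnet as mx
    import gluonnlp as nlp

    ctx = mx.cpu()

    # same vocab/tokenizer setup as in the fine-tuning guide
    _, vocab = nlp.model.get_model('bert_12_768_12',
                                   dataset_name='book_corpus_wiki_en_uncased',
                                   pretrained=True, ctx=ctx,
                                   use_pooler=True, use_decoder=False,
                                   use_classifier=False)
    tokenizer = nlp.data.BERTTokenizer(vocab, lower=True)
    transform = nlp.data.BERTSentenceTransform(tokenizer, max_seq_length=128,
                                               pair=True)

    # load the exported symbol/params (file names assumed from model.export)
    net = mx.gluon.nn.SymbolBlock.imports('model-symbol.json',
                                          ['data0', 'data1', 'data2'],
                                          'model-0000.params', ctx=ctx)

    sentence1 = "The apple fell from the tree"
    sentence2 = "The pear fell from the tree"

    # turn the sentence pair into token ids, valid length and segment ids
    token_ids, valid_length, segment_ids = transform((sentence1, sentence2))
    token_ids = mx.nd.array([token_ids], ctx=ctx)
    segment_ids = mx.nd.array([segment_ids], ctx=ctx)
    valid_length = mx.nd.array([valid_length], ctx=ctx).reshape(-1).astype('float32')

    logits = net(token_ids, segment_ids, valid_length)
    result = int(mx.nd.argmax(logits, axis=1).asscalar())
    print(result)  # 0 = not a paraphrase, 1 = is a paraphrase

The other option I considered was re-creating the BERTClassifier from the guide in Python and calling load_parameters() on saved .params weights instead of exporting, but I don't know which of these is the recommended way to deploy.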


What would be the best way to go about this? Thanks in advance!!