Forums - LLM on-device with SNPE

2 posts / 0 new
Last post
LLM on-device with SNPE
victor.escorcia
Join Date: 2 Mar 23
Posts: 5
Posted: Fri, 2023-06-09 05:57

Dear team &/OR Community,

I bet the LLM tsunami has put a lot of people to work on porting Large Language Models (LLMs) to run on-device (e.g., on high-end mobile phones). Does anyone want to share some pointers involving ONNX & SNPE?

FYI: I filed this issue in the Hugging Face Optimum library, as I failed to port a couple of (L)LMs provided there:
https://github.com/huggingface/optimum/issues/1092

Has anyone managed to export a PyTorch LLM to ONNX with opset version 9?
AFAIK that's the only opset version supported by SNPE. Kindly correct me if I'm wrong.

Thanks in advance for any help,
Victor

weihuan
Join Date: 12 Apr 20
Posts: 270
Posted: Fri, 2023-06-16 18:33

Dear developer,

We're glad to see that you have tried LLM on SNPE.

There is no limitation on opset version in PyTorch. Are there any issues from the SNPE converter?

BR.

Wei

