I followed the document "QNN GenAI Transformer SDK Download and Utilization" and downloaded qualcomm_ai_engine_direct version 2.19.0.240124, but I can't find qnn-genai-transformer-composer.
Could you help me?
I want to run tiny-llama on Snapdragon_8_gen_3. Do you have a guide for that?
Hi,
May I know what you are trying to do?
Is it about model conversion using QNN, to run the model on Snapdragon_8_gen_3?
Yes, I want to use QNN model conversion to convert tiny-llama and run it on Snapdragon_8_gen_3.
Is the document "QNN GenAI Transformer SDK Download and Utilization" not applicable here?
Or do you have another guide for running an LLM model on Snapdragon_8_gen_3?
Hi,
For model conversion using QNN, you can check the link below:
https://docs.qualcomm.com/bundle/publicresource/topics/80-63442-50/tutor...
"I'm stuck with the same issue. Any thoughts on how to resolve this error?"
For the command in that link, "${QNN_SDK_ROOT}/bin/x86_64-linux-clang/qnn-pytorch-converter", how should I assign --input_network, --input_dim, and --input_list?
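Not from the linked tutorial, but here is a rough sketch of how those three flags typically fit together. The input name (`input_ids`), the sequence length (128), the `name:=path` input-list format, and the traced-model filename are all assumptions based on common QNN/SNPE converter conventions, not confirmed for this SDK version; please check the converter's `--help` output for the authoritative syntax.

```shell
# Hedged sketch of a qnn-pytorch-converter invocation for tiny-llama.
# Input name, dims, and file formats below are assumptions.
#
# --input_network: a TorchScript model (e.g. produced with torch.jit.trace).
# --input_dim:     the graph input name plus its comma-separated shape.
# --input_list:    a text file listing raw input files, one inference per
#                  line (assumed "name:=path" format, as in SNPE).

# Create a dummy raw input: 128 int32 token IDs for a 1x128 sequence.
python3 -c "import struct,sys; sys.stdout.buffer.write(struct.pack('<128i', *range(128)))" > input_ids.raw

# List the raw file so the converter can use it (e.g. for quantization).
echo "input_ids:=input_ids.raw" > input_list.txt

# Run the converter only if the QNN SDK is actually installed.
CONVERTER="${QNN_SDK_ROOT}/bin/x86_64-linux-clang/qnn-pytorch-converter"
if [ -x "$CONVERTER" ]; then
  "$CONVERTER" \
    --input_network tiny_llama_traced.pt \
    --input_dim "input_ids" 1,128 \
    --input_list input_list.txt
fi
```

The guarded `if [ -x ... ]` check just keeps the sketch harmless on a machine without the SDK; in practice you would run the converter directly.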