pyxiiin/cann-ops-adv

forked from Ascend/cann-ops-adv 
blacklist.txt (5.65 KB)
src/transformer/flash_attention_score/flash_attention_score.cpp
src/transformer/flash_attention_score/ophost/flash_attention_score_def.cpp
src/transformer/flash_attention_score/ophost/flash_attention_score_proto.cpp
src/transformer/flash_attention_score_grad/flash_attention_score_grad.cpp
src/transformer/flash_attention_score_grad/ophost/flash_attention_score_grad_def.cpp
src/transformer/flash_attention_score_grad/ophost/flash_attention_score_grad_proto.cpp
src/utils/inc/fallback_comm.h
src/utils/inc/fallback.h
src/utils/src/fallback_comm.cpp
src/transformer/ffn/ophost/ffn_proto.cpp
src/transformer/ffn/ophost/aclnn_ffn.h
src/transformer/ffn/ophost/aclnn_ffn_v2.h
src/transformer/ffn/ophost/aclnn_ffn_v3.h
src/transformer/ffn/ophost/aclnn_ffn.cpp
src/transformer/ffn/ophost/ffn.h
src/transformer/ffn/ophost/ffn.cpp
src/transformer/ffn/ophost/fallback_ffn.cpp
src/transformer/ffn/ophost/ffn_def.cpp
src/transformer/ffn/ffn_nonquant_nz.h
src/transformer/incre_flash_attention/ifa_public_define.h
src/transformer/incre_flash_attention/incre_flash_attention.cpp
src/transformer/incre_flash_attention/incre_flash_attention_allvec_new.h
src/transformer/incre_flash_attention/incre_flash_attention_split_Bbn2s2_Us2.h
src/transformer/incre_flash_attention/ophost/aclnn_incre_flash_attention.cpp
src/transformer/incre_flash_attention/ophost/aclnn_incre_flash_attention.h
src/transformer/incre_flash_attention/ophost/aclnn_incre_flash_attention_v2.cpp
src/transformer/incre_flash_attention/ophost/aclnn_incre_flash_attention_v3.cpp
src/transformer/incre_flash_attention/ophost/aclnn_incre_flash_attention_v3.h
src/transformer/incre_flash_attention/ophost/aclnn_incre_flash_attention_v4.cpp
src/transformer/incre_flash_attention/ophost/aclnn_incre_flash_attention_v4.h
src/transformer/incre_flash_attention/ophost/fallback_incre_flash_attention.cpp
src/transformer/incre_flash_attention/ophost/incre_flash_attention_def.cpp
src/transformer/incre_flash_attention/ophost/incre_flash_attention_tiling.cc
src/transformer/incre_flash_attention/ophost/incre_flash_attention_tiling.h
src/transformer/incre_flash_attention/ophost/incre_flash_attention_tiling_register.cc
src/transformer/prompt_flash_attention/kernel_data_copy_transpose.h
src/transformer/prompt_flash_attention/kernel_operator_softmax_compute_nz.h
src/transformer/prompt_flash_attention/ophost/aclnn_prompt_flash_attention.cpp
src/transformer/prompt_flash_attention/ophost/aclnn_prompt_flash_attention.h
src/transformer/prompt_flash_attention/ophost/aclnn_prompt_flash_attention_inner.cpp
src/transformer/prompt_flash_attention/ophost/aclnn_prompt_flash_attention_inner.h
src/transformer/prompt_flash_attention/ophost/aclnn_prompt_flash_attention_v2.cpp
src/transformer/prompt_flash_attention/ophost/aclnn_prompt_flash_attention_v2.h
src/transformer/prompt_flash_attention/ophost/aclnn_prompt_flash_attention_v3.cpp
src/transformer/prompt_flash_attention/ophost/aclnn_prompt_flash_attention_v3.h
src/transformer/prompt_flash_attention/ophost/fallback_prompt_flash_attention.cpp
src/transformer/prompt_flash_attention/ophost/prompt_flash_attention_def.cpp
src/transformer/prompt_flash_attention/ophost/prompt_flash_attention.h
src/transformer/prompt_flash_attention/ophost/prompt_flash_attention_proto.cpp
src/transformer/prompt_flash_attention/ophost/prompt_flash_attention_base_aclnn.cpp
src/transformer/prompt_flash_attention/ophost/prompt_flash_attention_tiling.cpp
src/transformer/prompt_flash_attention/ophost/prompt_flash_attention_tiling.h
src/transformer/prompt_flash_attention/ophost/prompt_flash_attention_tiling_register.cc
src/transformer/prompt_flash_attention/prompt_flash_attention.cpp
src/transformer/prompt_flash_attention/prompt_flash_attention_base.h
src/transformer/prompt_flash_attention/prompt_flash_attention_bnstilling_n_s_no_tail.h
src/transformer/prompt_flash_attention/prompt_flash_attention_bnstilling_n_s_no_tailWBNSD.h
src/transformer/prompt_flash_attention/prompt_flash_attention_bnstilling_n_s_no_tailWBNSD_KV_NZ.h
src/transformer/prompt_flash_attention/prompt_flash_attention_bnstilling_n_s_tail.h
src/transformer/prompt_flash_attention/prompt_flash_attention_bnstilling_n_s_tailWBNSD.h
src/transformer/prompt_flash_attention/prompt_flash_attention_empty_tensor.h
src/transformer/prompt_flash_attention/prompt_flash_attention_nz_kv_base.h
src/transformer/prompt_flash_attention/prompt_flash_attention_s1s2_bns1_x310.h
src/transformer/prompt_flash_attention/prompt_flash_attention_s1s2_bns1_x310_base.h
src/transformer/prompt_flash_attention/prompt_flash_attention_s1s2_bns1_x910.h
src/transformer/prompt_flash_attention/prompt_flash_attention_s1s2_bns1_x910_base.h
src/transformer/prompt_flash_attention/prompt_flash_attention_split_n_s_no_tail.h
src/transformer/prompt_flash_attention/prompt_flash_attention_split_n_s_tail.h
src/transformer/fused_infer_attention_score/ophost/aclnn_fused_infer_attention_score_v2.h
src/transformer/fused_infer_attention_score/ophost/aclnn_fused_infer_attention_score_v2.cpp
src/transformer/fused_infer_attention_score/ophost/aclnn_fused_infer_attention_score.h
src/transformer/fused_infer_attention_score/ophost/aclnn_fused_infer_attention_score.cpp
src/transformer/fused_infer_attention_score/ophost/fused_infer_attention_score_def.cpp
src/transformer/fused_infer_attention_score/ophost/fused_infer_attention_score_tiling_register.cpp
src/transformer/fused_infer_attention_score/ophost/fused_infer_attention_score_tiling.cpp
src/transformer/fused_infer_attention_score/ophost/fused_infer_attention_score_tiling.h
src/transformer/fused_infer_attention_score/ophost/fallback_fused_infer_attention_score.cpp
src/transformer/fused_infer_attention_score/fused_infer_attention_score.cpp
src/transformer/incre_flash_attention/ophost/aclnn_incre_flash_attention_v2.h
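This page does not show how blacklist.txt is consumed by the cann-ops-adv build or CI. Purely as an illustrative sketch (the usage, file name handling, and all identifiers below are assumptions, not taken from the repository), a tool could load the list into a set and test candidate paths against it:

// Illustrative sketch only, not part of this repository: one hypothetical way
// a build or CI script might consume a path blacklist like the one above.
#include <fstream>
#include <iostream>
#include <string>
#include <unordered_set>

// Load every non-empty line of the blacklist file into a set of paths.
std::unordered_set<std::string> LoadBlacklist(const std::string &file) {
    std::unordered_set<std::string> paths;
    std::ifstream in(file);
    std::string line;
    while (std::getline(in, line)) {
        if (!line.empty()) {
            paths.insert(line);
        }
    }
    return paths;
}

int main() {
    auto blacklist = LoadBlacklist("blacklist.txt");
    // Hypothetical check: skip a source file if it appears in the blacklist.
    std::string candidate = "src/transformer/ffn/ophost/ffn_def.cpp";
    if (blacklist.count(candidate) != 0) {
        std::cout << candidate << " is blacklisted, skipping\n";
    } else {
        std::cout << candidate << " is not blacklisted\n";
    }
    return 0;
}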