# yolo_v5_sample.prototxt
# YOLOv5 network definition (Caffe prototxt) for NNIE deployment, from the
# yolo_v5_nnie repository (https://gitee.com/Jundy/yolo_v5_nnie, forked from
# shopping/yolo_v5_nnie). Committed by shopping on 2020-11-30:
# "add yolo_v5_sample.prototxt".
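#
# Overview (inferred from the layer graph below): the network takes a
# 1x3x320x320 NCHW image, runs a CSP-style backbone with residual bottlenecks
# (Eltwise SUM shortcuts), an SPP block of parallel 5/9/13 max-pools, and a
# PAN-style top-down neck that upsamples with depthwise 2x2 stride-2
# deconvolutions. It emits three detection heads that Reshape_154/169/184
# expose as [N, 3, 53, cells] grids at strides 8, 16 and 32. Numeric blob
# names such as "147" are tensor indices apparently carried over from an
# ONNX export.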
layer {
name: "images"
type: "Input"
top: "images"
input_param {
shape {
dim: 1
dim: 3
dim: 320
dim: 320
}
}
}
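# Single input: one 3-channel 320x320 image in NCHW layout, batch size 1.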
layer {
name: "Conv_0"
type: "Convolution"
bottom: "images"
top: "147"
convolution_param {
num_output: 16
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 2
stride_w: 2
dilation: 1
}
}
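# Conv_0 (3x3, stride 2) appears to stand in for YOLOv5's Focus
# slice-and-concat stem, which NNIE cannot express directly; note that no
# activation separates Conv_0 from Conv_1.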
layer {
name: "Conv_1"
type: "Convolution"
bottom: "147"
top: "148"
convolution_param {
num_output: 32
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_2"
type: "ReLU"
bottom: "148"
top: "149"
}
layer {
name: "Conv_3"
type: "Convolution"
bottom: "149"
top: "150"
convolution_param {
num_output: 64
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 2
stride_w: 2
dilation: 1
}
}
layer {
name: "Relu_4"
type: "ReLU"
bottom: "150"
top: "151"
}
layer {
name: "Conv_5"
type: "Convolution"
bottom: "151"
top: "152"
convolution_param {
num_output: 32
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_6"
type: "ReLU"
bottom: "152"
top: "153"
}
layer {
name: "Conv_7"
type: "Convolution"
bottom: "153"
top: "154"
convolution_param {
num_output: 64
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_8"
type: "ReLU"
bottom: "154"
top: "155"
}
layer {
name: "Add_9"
type: "Eltwise"
bottom: "151"
bottom: "155"
top: "156"
eltwise_param {
operation: SUM
}
}
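# Eltwise SUM layers such as Add_9 are the residual shortcuts of the
# bottleneck blocks: the block input ("151") is added to the output of the
# 1x1 + 3x3 convolution pair ("155").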
layer {
name: "Conv_10"
type: "Convolution"
bottom: "156"
top: "157"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 2
stride_w: 2
dilation: 1
}
}
layer {
name: "Relu_11"
type: "ReLU"
bottom: "157"
top: "158"
}
layer {
name: "Conv_12"
type: "Convolution"
bottom: "158"
top: "159"
convolution_param {
num_output: 64
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_13"
type: "ReLU"
bottom: "159"
top: "160"
}
layer {
name: "Conv_14"
type: "Convolution"
bottom: "160"
top: "161"
convolution_param {
num_output: 64
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_15"
type: "ReLU"
bottom: "161"
top: "162"
}
layer {
name: "Conv_16"
type: "Convolution"
bottom: "162"
top: "163"
convolution_param {
num_output: 64
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_17"
type: "ReLU"
bottom: "163"
top: "164"
}
layer {
name: "Add_18"
type: "Eltwise"
bottom: "160"
bottom: "164"
top: "165"
eltwise_param {
operation: SUM
}
}
layer {
name: "Conv_19"
type: "Convolution"
bottom: "165"
top: "166"
convolution_param {
num_output: 64
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_20"
type: "ReLU"
bottom: "166"
top: "167"
}
layer {
name: "Conv_21"
type: "Convolution"
bottom: "167"
top: "168"
convolution_param {
num_output: 64
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_22"
type: "ReLU"
bottom: "168"
top: "169"
}
layer {
name: "Add_23"
type: "Eltwise"
bottom: "165"
bottom: "169"
top: "170"
eltwise_param {
operation: SUM
}
}
layer {
name: "Conv_24"
type: "Convolution"
bottom: "170"
top: "171"
convolution_param {
num_output: 64
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_25"
type: "ReLU"
bottom: "171"
top: "172"
}
layer {
name: "Conv_26"
type: "Convolution"
bottom: "172"
top: "173"
convolution_param {
num_output: 64
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_27"
type: "ReLU"
bottom: "173"
top: "174"
}
layer {
name: "Add_28"
type: "Eltwise"
bottom: "170"
bottom: "174"
top: "175"
eltwise_param {
operation: SUM
}
}
layer {
name: "Conv_29"
type: "Convolution"
bottom: "175"
top: "176"
convolution_param {
num_output: 64
bias_term: false
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Conv_30"
type: "Convolution"
bottom: "158"
top: "177"
convolution_param {
num_output: 64
bias_term: false
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Concat_31"
type: "Concat"
bottom: "176"
bottom: "177"
top: "178"
concat_param {
axis: 1
}
}
layer {
name: "BatchNormalization_32_bn"
type: "BatchNorm"
bottom: "178"
top: "179"
batch_norm_param {
use_global_stats: true
eps: 1e-04
}
}
layer {
name: "BatchNormalization_32"
type: "Scale"
bottom: "179"
top: "179"
scale_param {
bias_term: true
}
}
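# Caffe expresses an affine batch norm as two layers: "BatchNorm" applies the
# stored mean/variance (use_global_stats: true, i.e. inference mode) and the
# in-place "Scale" layer applies the learned gamma/beta. The surrounding
# pattern (two bias-free 1x1 convolutions concatenated, then BN + ReLU)
# matches the branch merge of YOLOv5's cross-stage-partial (CSP) blocks.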
layer {
name: "Relu_33"
type: "ReLU"
bottom: "179"
top: "180"
}
layer {
name: "Conv_34"
type: "Convolution"
bottom: "180"
top: "181"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_35"
type: "ReLU"
bottom: "181"
top: "182"
}
layer {
name: "Conv_36"
type: "Convolution"
bottom: "182"
top: "183"
convolution_param {
num_output: 256
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 2
stride_w: 2
dilation: 1
}
}
layer {
name: "Relu_37"
type: "ReLU"
bottom: "183"
top: "184"
}
layer {
name: "Conv_38"
type: "Convolution"
bottom: "184"
top: "185"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_39"
type: "ReLU"
bottom: "185"
top: "186"
}
layer {
name: "Conv_40"
type: "Convolution"
bottom: "186"
top: "187"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_41"
type: "ReLU"
bottom: "187"
top: "188"
}
layer {
name: "Conv_42"
type: "Convolution"
bottom: "188"
top: "189"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_43"
type: "ReLU"
bottom: "189"
top: "190"
}
layer {
name: "Add_44"
type: "Eltwise"
bottom: "186"
bottom: "190"
top: "191"
eltwise_param {
operation: SUM
}
}
layer {
name: "Conv_45"
type: "Convolution"
bottom: "191"
top: "192"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_46"
type: "ReLU"
bottom: "192"
top: "193"
}
layer {
name: "Conv_47"
type: "Convolution"
bottom: "193"
top: "194"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_48"
type: "ReLU"
bottom: "194"
top: "195"
}
layer {
name: "Add_49"
type: "Eltwise"
bottom: "191"
bottom: "195"
top: "196"
eltwise_param {
operation: SUM
}
}
layer {
name: "Conv_50"
type: "Convolution"
bottom: "196"
top: "197"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_51"
type: "ReLU"
bottom: "197"
top: "198"
}
layer {
name: "Conv_52"
type: "Convolution"
bottom: "198"
top: "199"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_53"
type: "ReLU"
bottom: "199"
top: "200"
}
layer {
name: "Add_54"
type: "Eltwise"
bottom: "196"
bottom: "200"
top: "201"
eltwise_param {
operation: SUM
}
}
layer {
name: "Conv_55"
type: "Convolution"
bottom: "201"
top: "202"
convolution_param {
num_output: 128
bias_term: false
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Conv_56"
type: "Convolution"
bottom: "184"
top: "203"
convolution_param {
num_output: 128
bias_term: false
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Concat_57"
type: "Concat"
bottom: "202"
bottom: "203"
top: "204"
concat_param {
axis: 1
}
}
layer {
name: "BatchNormalization_58_bn"
type: "BatchNorm"
bottom: "204"
top: "205"
batch_norm_param {
use_global_stats: true
eps: 1e-04
}
}
layer {
name: "BatchNormalization_58"
type: "Scale"
bottom: "205"
top: "205"
scale_param {
bias_term: true
}
}
layer {
name: "Relu_59"
type: "ReLU"
bottom: "205"
top: "206"
}
layer {
name: "Conv_60"
type: "Convolution"
bottom: "206"
top: "207"
convolution_param {
num_output: 256
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_61"
type: "ReLU"
bottom: "207"
top: "208"
}
layer {
name: "Conv_62"
type: "Convolution"
bottom: "208"
top: "209"
convolution_param {
num_output: 512
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 2
stride_w: 2
dilation: 1
}
}
layer {
name: "Relu_63"
type: "ReLU"
bottom: "209"
top: "210"
}
layer {
name: "Conv_64"
type: "Convolution"
bottom: "210"
top: "211"
convolution_param {
num_output: 256
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_65"
type: "ReLU"
bottom: "211"
top: "212"
}
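# SPP block: three stride-1 max-pools with kernels 5, 9 and 13 (padded to
# preserve the 10x10 spatial size at this depth) run in parallel on "212"
# and are concatenated with it channel-wise, so Conv_70 sees 4 x 256 = 1024
# channels.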
layer {
name: "MaxPool_66"
type: "Pooling"
bottom: "212"
top: "213"
pooling_param {
pool: MAX
kernel_h: 5
kernel_w: 5
stride_h: 1
stride_w: 1
pad_h: 2
pad_w: 2
}
}
layer {
name: "MaxPool_67"
type: "Pooling"
bottom: "212"
top: "214"
pooling_param {
pool: MAX
kernel_h: 9
kernel_w: 9
stride_h: 1
stride_w: 1
pad_h: 4
pad_w: 4
}
}
layer {
name: "MaxPool_68"
type: "Pooling"
bottom: "212"
top: "215"
pooling_param {
pool: MAX
kernel_h: 13
kernel_w: 13
stride_h: 1
stride_w: 1
pad_h: 6
pad_w: 6
}
}
layer {
name: "Concat_69"
type: "Concat"
bottom: "212"
bottom: "213"
bottom: "214"
bottom: "215"
top: "216"
concat_param {
axis: 1
}
}
layer {
name: "Conv_70"
type: "Convolution"
bottom: "216"
top: "217"
convolution_param {
num_output: 512
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_71"
type: "ReLU"
bottom: "217"
top: "218"
}
layer {
name: "Conv_72"
type: "Convolution"
bottom: "218"
top: "219"
convolution_param {
num_output: 256
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_73"
type: "ReLU"
bottom: "219"
top: "220"
}
layer {
name: "Conv_74"
type: "Convolution"
bottom: "220"
top: "221"
convolution_param {
num_output: 256
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_75"
type: "ReLU"
bottom: "221"
top: "222"
}
layer {
name: "Conv_76"
type: "Convolution"
bottom: "222"
top: "223"
convolution_param {
num_output: 256
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_77"
type: "ReLU"
bottom: "223"
top: "224"
}
layer {
name: "Add_78"
type: "Eltwise"
bottom: "220"
bottom: "224"
top: "225"
eltwise_param {
operation: SUM
}
}
layer {
name: "Conv_79"
type: "Convolution"
bottom: "225"
top: "226"
convolution_param {
num_output: 256
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_80"
type: "ReLU"
bottom: "226"
top: "227"
}
layer {
name: "Conv_81"
type: "Convolution"
bottom: "227"
top: "228"
convolution_param {
num_output: 256
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_82"
type: "ReLU"
bottom: "228"
top: "229"
}
layer {
name: "Add_83"
type: "Eltwise"
bottom: "225"
bottom: "229"
top: "230"
eltwise_param {
operation: SUM
}
}
layer {
name: "Conv_84"
type: "Convolution"
bottom: "230"
top: "231"
convolution_param {
num_output: 256
bias_term: false
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Conv_85"
type: "Convolution"
bottom: "218"
top: "232"
convolution_param {
num_output: 256
bias_term: false
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Concat_86"
type: "Concat"
bottom: "231"
bottom: "232"
top: "233"
concat_param {
axis: 1
}
}
layer {
name: "BatchNormalization_87_bn"
type: "BatchNorm"
bottom: "233"
top: "234"
batch_norm_param {
use_global_stats: true
eps: 1e-04
}
}
layer {
name: "BatchNormalization_87"
type: "Scale"
bottom: "234"
top: "234"
scale_param {
bias_term: true
}
}
layer {
name: "Relu_88"
type: "ReLU"
bottom: "234"
top: "235"
}
layer {
name: "Conv_89"
type: "Convolution"
bottom: "235"
top: "236"
convolution_param {
num_output: 512
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_90"
type: "ReLU"
bottom: "236"
top: "237"
}
layer {
name: "Conv_91"
type: "Convolution"
bottom: "237"
top: "238"
convolution_param {
num_output: 256
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_92"
type: "ReLU"
bottom: "238"
top: "239"
}
layer {
name: "Conv_93"
type: "Convolution"
bottom: "239"
top: "240"
convolution_param {
num_output: 256
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_94"
type: "ReLU"
bottom: "240"
top: "241"
}
layer {
name: "Conv_95"
type: "Convolution"
bottom: "241"
top: "242"
convolution_param {
num_output: 256
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_96"
type: "ReLU"
bottom: "242"
top: "243"
}
layer {
name: "Conv_97"
type: "Convolution"
bottom: "243"
top: "244"
convolution_param {
num_output: 256
bias_term: false
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Conv_98"
type: "Convolution"
bottom: "237"
top: "245"
convolution_param {
num_output: 256
bias_term: false
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Concat_99"
type: "Concat"
bottom: "244"
bottom: "245"
top: "246"
concat_param {
axis: 1
}
}
layer {
name: "BatchNormalization_100_bn"
type: "BatchNorm"
bottom: "246"
top: "247"
batch_norm_param {
use_global_stats: true
eps: 1e-04
}
}
layer {
name: "BatchNormalization_100"
type: "Scale"
bottom: "247"
top: "247"
scale_param {
bias_term: true
}
}
layer {
name: "Relu_101"
type: "ReLU"
bottom: "247"
top: "248"
}
layer {
name: "Conv_102"
type: "Convolution"
bottom: "248"
top: "249"
convolution_param {
num_output: 512
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_103"
type: "ReLU"
bottom: "249"
top: "250"
}
layer {
name: "Conv_104"
type: "Convolution"
bottom: "250"
top: "251"
convolution_param {
num_output: 159
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
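# Each detection head ends in a 1x1 convolution with 159 outputs:
# 3 anchors x 53 channels per anchor. With the usual 4 box + 1 objectness
# terms, 53 channels implies a 48-class model.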
layer {
name: "ConvTranspose_105"
type: "Deconvolution"
bottom: "250"
top: "252"
convolution_param {
num_output: 512
bias_term: true
group: 512
pad_h: 0
pad_w: 0
kernel_h: 2
kernel_w: 2
stride_h: 2
stride_w: 2
}
}
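# NNIE has no native Upsample layer; ports like this one typically replace
# YOLOv5's 2x nearest-neighbor upsample with a depthwise (group ==
# num_output) 2x2 stride-2 Deconvolution, whose weights, when fixed to 1,
# reproduce nearest-neighbor upsampling exactly.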
layer {
name: "Concat_106"
type: "Concat"
bottom: "252"
bottom: "208"
top: "253"
concat_param {
axis: 1
}
}
layer {
name: "Conv_107"
type: "Convolution"
bottom: "253"
top: "254"
convolution_param {
num_output: 256
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_108"
type: "ReLU"
bottom: "254"
top: "255"
}
layer {
name: "Conv_109"
type: "Convolution"
bottom: "255"
top: "256"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_110"
type: "ReLU"
bottom: "256"
top: "257"
}
layer {
name: "Conv_111"
type: "Convolution"
bottom: "257"
top: "258"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_112"
type: "ReLU"
bottom: "258"
top: "259"
}
layer {
name: "Conv_113"
type: "Convolution"
bottom: "259"
top: "260"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_114"
type: "ReLU"
bottom: "260"
top: "261"
}
layer {
name: "Conv_115"
type: "Convolution"
bottom: "261"
top: "262"
convolution_param {
num_output: 128
bias_term: false
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Conv_116"
type: "Convolution"
bottom: "255"
top: "263"
convolution_param {
num_output: 128
bias_term: false
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Concat_117"
type: "Concat"
bottom: "262"
bottom: "263"
top: "264"
concat_param {
axis: 1
}
}
layer {
name: "BatchNormalization_118_bn"
type: "BatchNorm"
bottom: "264"
top: "265"
batch_norm_param {
use_global_stats: true
eps: 1e-04
}
}
layer {
name: "BatchNormalization_118"
type: "Scale"
bottom: "265"
top: "265"
scale_param {
bias_term: true
}
}
layer {
name: "Relu_119"
type: "ReLU"
bottom: "265"
top: "266"
}
layer {
name: "Conv_120"
type: "Convolution"
bottom: "266"
top: "267"
convolution_param {
num_output: 256
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_121"
type: "ReLU"
bottom: "267"
top: "268"
}
layer {
name: "Conv_122"
type: "Convolution"
bottom: "268"
top: "269"
convolution_param {
num_output: 159
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "ConvTranspose_123"
type: "Deconvolution"
bottom: "268"
top: "270"
convolution_param {
num_output: 256
bias_term: true
group: 256
pad_h: 0
pad_w: 0
kernel_h: 2
kernel_w: 2
stride_h: 2
stride_w: 2
}
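# Second 2x upsample, using the same depthwise-deconvolution substitute as
# ConvTranspose_105.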
}
layer {
name: "Concat_124"
type: "Concat"
bottom: "270"
bottom: "182"
top: "271"
concat_param {
axis: 1
}
}
layer {
name: "Conv_125"
type: "Convolution"
bottom: "271"
top: "272"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_126"
type: "ReLU"
bottom: "272"
top: "273"
}
layer {
name: "Conv_127"
type: "Convolution"
bottom: "273"
top: "274"
convolution_param {
num_output: 64
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_128"
type: "ReLU"
bottom: "274"
top: "275"
}
layer {
name: "Conv_129"
type: "Convolution"
bottom: "275"
top: "276"
convolution_param {
num_output: 64
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_130"
type: "ReLU"
bottom: "276"
top: "277"
}
layer {
name: "Conv_131"
type: "Convolution"
bottom: "277"
top: "278"
convolution_param {
num_output: 64
bias_term: true
group: 1
pad_h: 1
pad_w: 1
kernel_h: 3
kernel_w: 3
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_132"
type: "ReLU"
bottom: "278"
top: "279"
}
layer {
name: "Conv_133"
type: "Convolution"
bottom: "279"
top: "280"
convolution_param {
num_output: 64
bias_term: false
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Conv_134"
type: "Convolution"
bottom: "273"
top: "281"
convolution_param {
num_output: 64
bias_term: false
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Concat_135"
type: "Concat"
bottom: "280"
bottom: "281"
top: "282"
concat_param {
axis: 1
}
}
layer {
name: "BatchNormalization_136_bn"
type: "BatchNorm"
bottom: "282"
top: "283"
batch_norm_param {
use_global_stats: true
eps: 1e-04
}
}
layer {
name: "BatchNormalization_136"
type: "Scale"
bottom: "283"
top: "283"
scale_param {
bias_term: true
}
}
layer {
name: "Relu_137"
type: "ReLU"
bottom: "283"
top: "284"
}
layer {
name: "Conv_138"
type: "Convolution"
bottom: "284"
top: "285"
convolution_param {
num_output: 128
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Relu_139"
type: "ReLU"
bottom: "285"
top: "286"
}
layer {
name: "Conv_140"
type: "Convolution"
bottom: "286"
top: "287"
convolution_param {
num_output: 159
bias_term: true
group: 1
pad_h: 0
pad_w: 0
kernel_h: 1
kernel_w: 1
stride_h: 1
stride_w: 1
dilation: 1
}
}
layer {
name: "Reshape_154"
type: "Reshape"
bottom: "287"
top: "305"
reshape_param {
shape {
dim: 0
dim: 3
dim: 53
dim: 1600
}
}
}
layer {
name: "Reshape_169"
type: "Reshape"
bottom: "269"
top: "324"
reshape_param {
shape {
dim: 0
dim: 3
dim: 53
dim: 400
}
}
}
layer {
name: "Reshape_184"
type: "Reshape"
bottom: "251"
top: "343"
reshape_param {
shape {
dim: 0
dim: 3
dim: 53
dim: 100
}
}
}
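#
# Network outputs, as raw head activations reshaped to
# [N, anchors, channels, grid cells]:
#   "305": [N, 3, 53, 1600]  stride-8 head,  40x40 grid (from Conv_140 / "287")
#   "324": [N, 3, 53, 400]   stride-16 head, 20x20 grid (from Conv_122 / "269")
#   "343": [N, 3, 53, 100]   stride-32 head, 10x10 grid (from Conv_104 / "251")
# The gaps in the Reshape indices (154/169/184) suggest the exporter dropped
# the intermediate permute/sigmoid nodes, leaving box decoding to the host.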