timer name (string, 12 unique values) | value (float64, range 0–27.8k) |
|---|---|
megatron.core.transformer.attention.forward.qkv | 506.477234 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.217504 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.134432 |
megatron.core.transformer.attention.forward.core_attention | 5,400.070313 |
megatron.core.transformer.attention.forward.linear_proj | 3.165408 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5,912.947754 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 248.990341 |
megatron.core.transformer.mlp.forward.linear_fc1 | 1.21344 |
megatron.core.transformer.mlp.forward.activation | 179.953018 |
megatron.core.transformer.mlp.forward.linear_fc2 | 1.118848 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 183.261658 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.428896 |
megatron.core.transformer.attention.forward.qkv | 0.545184 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.073344 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.082304 |
megatron.core.transformer.attention.forward.core_attention | 1,195.512695 |
megatron.core.transformer.attention.forward.linear_proj | 0.114176 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,196.60498 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.030336 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.050272 |
megatron.core.transformer.mlp.forward.activation | 0.0104 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.142592 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.215616 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.03056 |
megatron.core.transformer.attention.forward.qkv | 312.78009 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.119968 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.09632 |
megatron.core.transformer.attention.forward.core_attention | 5,342.14209 |
megatron.core.transformer.attention.forward.linear_proj | 4.451456 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5,660.700195 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 478.873444 |
megatron.core.transformer.mlp.forward.linear_fc1 | 5.067456 |
megatron.core.transformer.mlp.forward.activation | 163.929062 |
megatron.core.transformer.mlp.forward.linear_fc2 | 3.40096 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 173.106079 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 1.014464 |
megatron.core.transformer.attention.forward.qkv | 0.585312 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.083328 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.091776 |
megatron.core.transformer.attention.forward.core_attention | 1,048.786987 |
megatron.core.transformer.attention.forward.linear_proj | 4.055232 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1,053.91626 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.093248 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.175488 |
megatron.core.transformer.mlp.forward.activation | 0.023136 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.398112 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.609408 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.093216 |
megatron.core.transformer.attention.forward.qkv | 0.037824 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003168 |
megatron.core.transformer.attention.forward.core_attention | 0.657856 |
megatron.core.transformer.attention.forward.linear_proj | 9.417536 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 10.138176 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.030336 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.05088 |
megatron.core.transformer.mlp.forward.activation | 0.01008 |
megatron.core.transformer.mlp.forward.linear_fc2 | 1.987616 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.060864 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.030464 |
megatron.core.transformer.attention.forward.qkv | 0.038176 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.003104 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003168 |
megatron.core.transformer.attention.forward.core_attention | 0.673152 |
megatron.core.transformer.attention.forward.linear_proj | 5.16416 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5.899616 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.030336 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.050496 |
megatron.core.transformer.mlp.forward.activation | 0.010144 |
megatron.core.transformer.mlp.forward.linear_fc2 | 1.528192 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.60064 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.030528 |
megatron.core.transformer.attention.forward.qkv | 0.100384 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
megatron.core.transformer.attention.forward.core_attention | 18.07456 |
megatron.core.transformer.attention.forward.linear_proj | 3.327328 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 21.527361 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.092672 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.171616 |
megatron.core.transformer.mlp.forward.activation | 0.021856 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.404352 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.609472 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.093088 |
megatron.core.transformer.attention.forward.qkv | 0.100096 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
megatron.core.transformer.attention.forward.core_attention | 7.075008 |
megatron.core.transformer.attention.forward.linear_proj | 2.95392 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 10.15344 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.092896 |
megatron.core.transformer.mlp.forward.linear_fc1 | 0.174848 |
megatron.core.transformer.mlp.forward.activation | 0.022496 |
megatron.core.transformer.mlp.forward.linear_fc2 | 0.401472 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.611072 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.09248 |
megatron.core.transformer.attention.forward.qkv | 0.037504 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003136 |
megatron.core.transformer.attention.forward.core_attention | 1,532.91626 |
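The same 12 Megatron-Core timer names repeat across several measurement windows above. A minimal sketch for aggregating such rows per timer, assuming the pipe-separated layout shown here (note the thousands separators in the larger values, e.g. `5,400.070313`, which must be stripped before parsing):

```python
from collections import defaultdict

# A small excerpt of the preview rows, in the same "name | value |" layout.
rows = """\
megatron.core.transformer.attention.forward.qkv | 506.477234 |
megatron.core.transformer.attention.forward.core_attention | 5,400.070313 |
megatron.core.transformer.attention.forward.qkv | 0.545184 |
megatron.core.transformer.attention.forward.core_attention | 1,195.512695 |
"""

totals = defaultdict(float)
for line in rows.strip().splitlines():
    # Splitting on "|" yields [name, value, ""] because rows end with a pipe.
    name, value, _ = (part.strip() for part in line.split("|"))
    # Remove thousands separators before converting to float.
    totals[name] += float(value.replace(",", ""))

for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {total:.6f}")
```

As the full preview suggests, `core_attention` dominates the per-layer cost by a wide margin, with `qkv` and the bda/mlp entries contributing far less.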