ampere_research/pytorch/output/altra_2_2_p2p-Gnutella25_100.output

srun: Job time limit was unset; set to partition default of 60 minutes
srun: ################################################################################
srun: # Please note that the oasis compute nodes have aarch64 architecture CPUs.     #
srun: # All submission nodes and all other compute nodes have x86_64 architecture    #
srun: # CPUs. Programs, environments, or other software that was built on x86_64     #
srun: # nodes may need to be rebuilt to properly execute on these nodes.             #
srun: ################################################################################
srun: job 3394140 queued and waiting for resources
srun: job 3394140 has been allocated resources
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([ 0, 9, 9, ..., 54704, 54704, 54705]),
col_indices=tensor([ 1, 2, 3, ..., 17949, 22685, 144]),
values=tensor([1., 1., 1., ..., 1., 1., 1.]), size=(22687, 22687),
nnz=54705, layout=torch.sparse_csr)
tensor([0.8199, 0.9849, 0.4642, ..., 0.7594, 0.3568, 0.4020])
Shape: torch.Size([22687, 22687])
NNZ: 54705
Density: 0.00010628522108964806
Time: 0.19272208213806152 seconds
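
spmv.py itself is not included in this output. Judging from the command line
('spmv.py matrices/p2p-Gnutella25.mtx 100'), the warning above (line 20 of the script
ends in '.to_sparse_csr().type(torch.float)'), and the printed fields, a minimal sketch
of such a script could look like the following. The use of scipy.io.mmread, the variable
names, and reading the second argument as an iteration count are assumptions, not taken
from the real script.

import sys
import time

import numpy as np
import torch
from scipy.io import mmread  # assumption: SciPy is used to parse the Matrix Market file

matrix_path = sys.argv[1]
iterations = int(sys.argv[2])  # assumption: the trailing "100" is an iteration count

# Build a sparse COO tensor from the .mtx data, then convert to CSR
# (the .to_sparse_csr() call is what triggers the beta-support UserWarning above).
coo = mmread(matrix_path)
matrix = torch.sparse_coo_tensor(
    torch.tensor(np.vstack((coo.row, coo.col))),
    torch.tensor(coo.data),
    coo.shape,
).to_sparse_csr().type(torch.float)
print(matrix)

# Random dense vector to multiply against.
vector = torch.rand(matrix.shape[1])
print(vector)

rows, cols = matrix.shape
nnz = matrix.values().shape[0]
print(f"Shape: {matrix.shape}")
print(f"NNZ: {nnz}")
print(f"Density: {nnz / (rows * cols)}")  # 54705 / 22687^2 = 1.0628...e-4, as printed above

# Time the repeated sparse matrix-vector products.
start = time.time()
for _ in range(iterations):
    result = matrix @ vector
print(f"Time: {time.time() - start} seconds")
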
Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella25.mtx 100':
64.71 msec task-clock:u # 0.018 CPUs utilized
0 context-switches:u # 0.000 /sec
0 cpu-migrations:u # 0.000 /sec
3,319 page-faults:u # 51.288 K/sec
57,611,295 cycles:u # 0.890 GHz (39.00%)
83,148,228 instructions:u # 1.44 insn per cycle (82.73%)
<not supported> branches:u
375,111 branch-misses:u
32,759,228 L1-dcache-loads:u # 506.221 M/sec
475,086 L1-dcache-load-misses:u # 1.45% of all L1-dcache accesses
<not supported> LLC-loads:u
<not supported> LLC-load-misses:u
31,366,158 L1-icache-loads:u # 484.694 M/sec
297,293 L1-icache-load-misses:u # 0.95% of all L1-icache accesses
35,611,781 dTLB-loads:u # 550.301 M/sec (25.73%)
<not counted> dTLB-load-misses:u (0.00%)
<not counted> iTLB-loads:u (0.00%)
<not counted> iTLB-load-misses:u (0.00%)
3.578384817 seconds time elapsed
14.435258000 seconds user
27.700836000 seconds sys
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([ 0, 9, 9, ..., 54704, 54704, 54705]),
col_indices=tensor([ 1, 2, 3, ..., 17949, 22685, 144]),
values=tensor([1., 1., 1., ..., 1., 1., 1.]), size=(22687, 22687),
nnz=54705, layout=torch.sparse_csr)
tensor([0.0069, 0.9904, 0.5316, ..., 0.2082, 0.4858, 0.4936])
Shape: torch.Size([22687, 22687])
NNZ: 54705
Density: 0.00010628522108964806
Time: 0.1423017978668213 seconds
Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella25.mtx 100':
318,386 BR_MIS_PRED_RETIRED:u # 0.0 per branch branch_misprediction_ratio
19,233,431 BR_RETIRED:u
3.555753224 seconds time elapsed
14.642518000 seconds user
30.112207000 seconds sys
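
The branch_misprediction_ratio above is BR_MIS_PRED_RETIRED / BR_RETIRED
= 318,386 / 19,233,431 ≈ 0.017 mispredictions per retired branch (about 1.7%);
it appears as 0.0 only because the derived metric is printed to one decimal place.
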
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([ 0, 9, 9, ..., 54704, 54704, 54705]),
col_indices=tensor([ 1, 2, 3, ..., 17949, 22685, 144]),
values=tensor([1., 1., 1., ..., 1., 1., 1.]), size=(22687, 22687),
nnz=54705, layout=torch.sparse_csr)
tensor([0.2250, 0.5676, 0.3018, ..., 0.5431, 0.7314, 0.5593])
Shape: torch.Size([22687, 22687])
NNZ: 54705
Density: 0.00010628522108964806
Time: 0.14638042449951172 seconds
Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella25.mtx 100':
27,039,805 L1I_TLB:u # 0.0 per TLB access itlb_walk_ratio
6,375 ITLB_WALK:u
17,290 DTLB_WALK:u # 0.0 per TLB access dtlb_walk_ratio
36,688,544 L1D_TLB:u
3.566915241 seconds time elapsed
16.116565000 seconds user
28.752519000 seconds sys
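
Likewise for the TLB-walk ratios: itlb_walk_ratio = ITLB_WALK / L1I_TLB
= 6,375 / 27,039,805 ≈ 0.00024 and dtlb_walk_ratio = DTLB_WALK / L1D_TLB
= 17,290 / 36,688,544 ≈ 0.00047 walks per TLB access, both of which round to the 0.0 shown.
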
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([ 0, 9, 9, ..., 54704, 54704, 54705]),
col_indices=tensor([ 1, 2, 3, ..., 17949, 22685, 144]),
values=tensor([1., 1., 1., ..., 1., 1., 1.]), size=(22687, 22687),
nnz=54705, layout=torch.sparse_csr)
tensor([0.0220, 0.7494, 0.7913, ..., 0.8924, 0.8542, 0.5491])
Shape: torch.Size([22687, 22687])
NNZ: 54705
Density: 0.00010628522108964806
Time: 0.17815685272216797 seconds
Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella25.mtx 100':
32,508,072 L1I_CACHE:u # 0.0 per cache access l1i_cache_miss_ratio
297,568 L1I_CACHE_REFILL:u
477,654 L1D_CACHE_REFILL:u # 0.0 per cache access l1d_cache_miss_ratio
34,044,579 L1D_CACHE:u
3.435706033 seconds time elapsed
14.690285000 seconds user
28.763423000 seconds sys
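
The L1 miss ratios above work out to l1i_cache_miss_ratio = L1I_CACHE_REFILL / L1I_CACHE
= 297,568 / 32,508,072 ≈ 0.0092 and l1d_cache_miss_ratio = L1D_CACHE_REFILL / L1D_CACHE
= 477,654 / 34,044,579 ≈ 0.014, i.e. roughly 0.9% and 1.4% of L1 accesses miss, in line
with the L1-icache/L1-dcache miss percentages of the first counter block.
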
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([ 0, 9, 9, ..., 54704, 54704, 54705]),
col_indices=tensor([ 1, 2, 3, ..., 17949, 22685, 144]),
values=tensor([1., 1., 1., ..., 1., 1., 1.]), size=(22687, 22687),
nnz=54705, layout=torch.sparse_csr)
tensor([0.6277, 0.4955, 0.9335, ..., 0.1476, 0.2079, 0.0931])
Shape: torch.Size([22687, 22687])
NNZ: 54705
Density: 0.00010628522108964806
Time: 0.14432048797607422 seconds
Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella25.mtx 100':
549,474 LL_CACHE_MISS_RD:u # 1.0 per cache access ll_cache_read_miss_ratio
561,939 LL_CACHE_RD:u
185,622 L2D_TLB:u # 0.1 per TLB access l2_tlb_miss_ratio
23,295 L2D_TLB_REFILL:u
305,878 L2D_CACHE_REFILL:u # 0.2 per cache access l2_cache_miss_ratio
1,763,089 L2D_CACHE:u
3.538826979 seconds time elapsed
15.006109000 seconds user
29.644298000 seconds sys
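
Finally, the last-level and L2 ratios: ll_cache_read_miss_ratio = LL_CACHE_MISS_RD / LL_CACHE_RD
= 549,474 / 561,939 ≈ 0.98 (printed as 1.0), l2_tlb_miss_ratio = L2D_TLB_REFILL / L2D_TLB
= 23,295 / 185,622 ≈ 0.13 (printed as 0.1), and l2_cache_miss_ratio = L2D_CACHE_REFILL / L2D_CACHE
= 305,878 / 1,763,089 ≈ 0.17 (printed as 0.2); nearly every read that reaches the
last-level cache misses it.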