ampere_research/pytorch/output/altra_2_2_p2p-Gnutella24_100.output
2024-12-03 00:20:09 -05:00


srun: Job time limit was unset; set to partition default of 60 minutes
srun: ################################################################################
srun: # Please note that the oasis compute nodes have aarch64 architecture CPUs.     #
srun: # All submission nodes and all other compute nodes have x86_64 architecture    #
srun: # CPUs. Programs, environments, or other software that was built on x86_64     #
srun: # nodes may need to be rebuilt to properly execute on these nodes.             #
srun: ################################################################################
srun: job 3394141 queued and waiting for resources
srun: job 3394141 has been allocated resources
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([    0,     9,     9,  ..., 65369, 65369, 65369]),
       col_indices=tensor([    1,     2,     3,  ..., 15065,  9401, 26517]),
       values=tensor([1., 1., 1.,  ..., 1., 1., 1.]), size=(26518, 26518),
       nnz=65369, layout=torch.sparse_csr)
tensor([0.6616, 0.1149, 0.0110, ..., 0.2481, 0.7877, 0.5589])
Shape: torch.Size([26518, 26518])
NNZ: 65369
Density: 9.295875717624285e-05
Time: 0.16974925994873047 seconds
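For reference, the Density value reported above is just NNZ divided by the total number of matrix entries; a quick Python check reproduces it from the shape and NNZ printed in this run:

```python
# Density of p2p-Gnutella24 as reported in the log: nnz / (rows * cols)
nnz = 65_369
n = 26_518          # square matrix, 26518 x 26518
density = nnz / (n * n)
print(density)      # matches the reported 9.295875717624285e-05
```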
 Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella24.mtx 100':

             61.92 msec task-clock:u                     #    0.017 CPUs utilized
                 0      context-switches:u               #    0.000 /sec
                 0      cpu-migrations:u                 #    0.000 /sec
             3,281      page-faults:u                    #   52.988 K/sec
        66,250,810      cycles:u                         #    1.070 GHz                        (62.94%)
        75,178,179      instructions:u                   #    1.13  insn per cycle             (83.47%)
   <not supported>      branches:u
           367,749      branch-misses:u
        33,064,095      L1-dcache-loads:u                #  533.986 M/sec
           465,542      L1-dcache-load-misses:u          #    1.41% of all L1-dcache accesses
   <not supported>      LLC-loads:u
   <not supported>      LLC-load-misses:u
        31,552,264      L1-icache-loads:u                #  509.570 M/sec
           296,060      L1-icache-load-misses:u          #    0.94% of all L1-icache accesses
        73,155,896      dTLB-loads:u                     #    1.181 G/sec                      (17.31%)
     <not counted>      dTLB-load-misses:u                                                     (0.00%)
     <not counted>      iTLB-loads:u                                                           (0.00%)
     <not counted>      iTLB-load-misses:u                                                     (0.00%)

       3.675971385 seconds time elapsed

      14.857293000 seconds user
      29.791187000 seconds sys
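perf's derived columns in the block above (IPC, miss percentages, CPU utilization) follow directly from the raw counts; recomputing them in Python confirms the printed values:

```python
# Derived metrics for the generic-event perf run, from the raw counters in the log
ipc = 75_178_179 / 66_250_810            # instructions / cycles -> perf prints 1.13
l1d_miss = 465_542 / 33_064_095          # L1-dcache-load-misses / loads -> 1.41%
l1i_miss = 296_060 / 31_552_264          # L1-icache-load-misses / loads -> 0.94%
cpus = 61.92e-3 / 3.675971385            # task-clock / wall time -> 0.017 CPUs utilized
print(f"{ipc:.2f} {l1d_miss:.2%} {l1i_miss:.2%} {cpus:.3f}")
```

The very low CPU utilization reflects that task-clock only covers the measured user-space work while most of the elapsed time is container startup and interpreter overhead.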
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([    0,     9,     9,  ..., 65369, 65369, 65369]),
       col_indices=tensor([    1,     2,     3,  ..., 15065,  9401, 26517]),
       values=tensor([1., 1., 1.,  ..., 1., 1., 1.]), size=(26518, 26518),
       nnz=65369, layout=torch.sparse_csr)
tensor([0.1683, 0.8999, 0.0578, ..., 0.5893, 0.0628, 0.8262])
Shape: torch.Size([26518, 26518])
NNZ: 65369
Density: 9.295875717624285e-05
Time: 0.2227163314819336 seconds
 Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella24.mtx 100':

           332,366      BR_MIS_PRED_RETIRED:u            #      0.0 per branch  branch_misprediction_ratio
        19,076,182      BR_RETIRED:u

       3.532329673 seconds time elapsed

      14.883993000 seconds user
      28.516661000 seconds sys
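The ratio column perf prints for this run is truncated to one decimal place, so "0.0 per branch" hides the actual misprediction rate; recomputing it from the raw counts above:

```python
# branch_misprediction_ratio from the raw Ampere PMU counts in the log
ratio = 332_366 / 19_076_182     # BR_MIS_PRED_RETIRED / BR_RETIRED
print(f"{ratio:.4f}")            # about 1.7% of retired branches mispredicted
```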
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([    0,     9,     9,  ..., 65369, 65369, 65369]),
       col_indices=tensor([    1,     2,     3,  ..., 15065,  9401, 26517]),
       values=tensor([1., 1., 1.,  ..., 1., 1., 1.]), size=(26518, 26518),
       nnz=65369, layout=torch.sparse_csr)
tensor([0.8389, 0.5614, 0.9033, ..., 0.2231, 0.0349, 0.5167])
Shape: torch.Size([26518, 26518])
NNZ: 65369
Density: 9.295875717624285e-05
Time: 0.17095375061035156 seconds
 Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella24.mtx 100':

        27,005,133      L1I_TLB:u                        #      0.0 per TLB access  itlb_walk_ratio
             4,791      ITLB_WALK:u
            13,403      DTLB_WALK:u                      #      0.0 per TLB access  dtlb_walk_ratio
        36,457,054      L1D_TLB:u

       3.579041343 seconds time elapsed

      14.885159000 seconds user
      29.562650000 seconds sys
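As in the branch run, the "0.0" walk ratios above are a rounding artifact of perf's one-decimal display; the actual TLB walk rates from the raw counts are small but nonzero:

```python
# TLB walk ratios from the raw Ampere PMU counts in the log
itlb_walk_ratio = 4_791 / 27_005_133     # ITLB_WALK / L1I_TLB accesses
dtlb_walk_ratio = 13_403 / 36_457_054    # DTLB_WALK / L1D_TLB accesses
print(f"{itlb_walk_ratio:.6f} {dtlb_walk_ratio:.6f}")
```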
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([    0,     9,     9,  ..., 65369, 65369, 65369]),
       col_indices=tensor([    1,     2,     3,  ..., 15065,  9401, 26517]),
       values=tensor([1., 1., 1.,  ..., 1., 1., 1.]), size=(26518, 26518),
       nnz=65369, layout=torch.sparse_csr)
tensor([0.8849, 0.5982, 0.0578, ..., 0.9975, 0.2204, 0.0718])
Shape: torch.Size([26518, 26518])
NNZ: 65369
Density: 9.295875717624285e-05
Time: 0.18003463745117188 seconds
 Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella24.mtx 100':

        32,367,686      L1I_CACHE:u                      #      0.0 per cache access  l1i_cache_miss_ratio
           287,524      L1I_CACHE_REFILL:u
           467,557      L1D_CACHE_REFILL:u               #      0.0 per cache access  l1d_cache_miss_ratio
        34,022,862      L1D_CACHE:u

       3.405321132 seconds time elapsed

      15.291636000 seconds user
      28.005015000 seconds sys
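The L1 refill ratios above are likewise rounded to "0.0"; computed from the raw counts, they agree closely with the L1 miss percentages from the generic-event run earlier in the log:

```python
# L1 cache refill (miss) ratios from the raw Ampere PMU counts in the log
l1i_cache_miss = 287_524 / 32_367_686    # L1I_CACHE_REFILL / L1I_CACHE
l1d_cache_miss = 467_557 / 34_022_862    # L1D_CACHE_REFILL / L1D_CACHE
print(f"{l1i_cache_miss:.2%} {l1d_cache_miss:.2%}")
```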
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([    0,     9,     9,  ..., 65369, 65369, 65369]),
       col_indices=tensor([    1,     2,     3,  ..., 15065,  9401, 26517]),
       values=tensor([1., 1., 1.,  ..., 1., 1., 1.]), size=(26518, 26518),
       nnz=65369, layout=torch.sparse_csr)
tensor([0.2790, 0.1291, 0.6053, ..., 0.1651, 0.4973, 0.6821])
Shape: torch.Size([26518, 26518])
NNZ: 65369
Density: 9.295875717624285e-05
Time: 0.22036528587341309 seconds
 Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella24.mtx 100':

           535,707      LL_CACHE_MISS_RD:u               #      1.0 per cache access  ll_cache_read_miss_ratio
           556,316      LL_CACHE_RD:u
           150,149      L2D_TLB:u                        #      0.1 per TLB access  l2_tlb_miss_ratio
            18,418      L2D_TLB_REFILL:u
           297,042      L2D_CACHE_REFILL:u               #      0.2 per cache access  l2_cache_miss_ratio
         1,687,364      L2D_CACHE:u

       3.505209576 seconds time elapsed

      15.297738000 seconds user
      29.848441000 seconds sys
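The one-decimal ratios in the final run ("1.0", "0.1", "0.2") round away detail worth noting, in particular that about 96% of last-level cache reads miss; the exact values from the raw counts:

```python
# Last-level cache and L2 ratios from the raw Ampere PMU counts in the log
ll_read_miss  = 535_707 / 556_316        # LL_CACHE_MISS_RD / LL_CACHE_RD
l2_tlb_miss   = 18_418 / 150_149         # L2D_TLB_REFILL / L2D_TLB
l2_cache_miss = 297_042 / 1_687_364      # L2D_CACHE_REFILL / L2D_CACHE
print(f"{ll_read_miss:.2f} {l2_tlb_miss:.2f} {l2_cache_miss:.2f}")
```

The near-unity LL read miss ratio is consistent with SpMV's irregular, low-reuse access pattern on a matrix this sparse.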