srun: Job time limit was unset; set to partition default of 60 minutes
srun: ################################################################################
srun: # Please note that the oasis compute nodes have aarch64 architecture CPUs.    #
srun: # All submission nodes and all other compute nodes have x86_64 architecture   #
srun: # CPUs. Programs, environments, or other software that was built on x86_64    #
srun: # nodes may need to be rebuilt to properly execute on these nodes.            #
srun: ################################################################################
srun: job 3394994 queued and waiting for resources
srun: job 3394994 has been allocated resources
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
  ).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([    0,     9,     9,  ..., 54704, 54704, 54705]),
       col_indices=tensor([    1,     2,     3,  ..., 17949, 22685,   144]),
       values=tensor([1., 1., 1.,  ..., 1., 1., 1.]), size=(22687, 22687),
       nnz=54705, layout=torch.sparse_csr)
tensor([0.1465, 0.4354, 0.7334,  ..., 0.2837, 0.5913, 0.9525])
Matrix: p2p-Gnutella25
Shape: torch.Size([22687, 22687])
NNZ: 54705
Density: 0.00010628522108964806
Time: 1.4786670207977295 seconds

 Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella25.mtx 1000':

             48.61 msec task-clock:u                 #    0.010 CPUs utilized
                 0      context-switches:u           #    0.000 /sec
                 0      cpu-migrations:u             #    0.000 /sec
             3,308      page-faults:u                #   68.054 K/sec
        60,072,179      cycles:u                     #    1.236 GHz                        (53.26%)
        70,991,785      instructions:u               #    1.18  insn per cycle             (71.54%)
                        branches:u
           371,197      branch-misses:u
        32,964,378      L1-dcache-loads:u            #  678.165 M/sec
           465,448      L1-dcache-load-misses:u      #    1.41% of all L1-dcache accesses
                        LLC-loads:u
                        LLC-load-misses:u
        31,435,424      L1-icache-loads:u            #  646.710 M/sec
           293,561      L1-icache-load-misses:u      #    0.93% of all L1-icache accesses
        56,761,270      dTLB-loads:u                 #    1.168 G/sec                      (30.54%)
                        dTLB-load-misses:u                                                 (0.00%)
                        iTLB-loads:u                                                       (0.00%)
                        iTLB-load-misses:u                                                 (0.00%)

       4.700046411 seconds time elapsed

      16.235801000 seconds user
      28.396327000 seconds sys
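The spmv.py script itself is not reproduced in this log. The sketch below is a hypothetical reconstruction of a benchmark that would produce output like the run above: it loads a MatrixMarket file, converts it to a sparse CSR tensor (matching the ").to_sparse_csr().type(torch.float)" line quoted in the warning), builds a random dense vector, and times the number of SpMV iterations given on the command line. The argument handling, variable names, and timing loop are assumptions, not the actual script.

# Hypothetical sketch of an spmv.py-style benchmark; the real script is not shown
# in this log, so the details below are inferred from the printed output.
import sys
import time
from pathlib import Path

import numpy as np
import torch
from scipy.io import mmread


def main():
    path, iterations = sys.argv[1], int(sys.argv[2])

    # Load the MatrixMarket file and build a sparse CSR tensor, mirroring the
    # ").to_sparse_csr().type(torch.float)" line reported in the warning above.
    coo = mmread(path)
    indices = torch.tensor(np.vstack((coo.row, coo.col)), dtype=torch.int64)
    matrix = torch.sparse_coo_tensor(
        indices, torch.tensor(coo.data), size=coo.shape
    ).to_sparse_csr().type(torch.float)
    print(matrix)

    # Dense right-hand-side vector, like the second tensor printed in each run.
    x = torch.rand(matrix.shape[1])
    print(x)

    rows, cols = matrix.shape
    nnz = matrix.values().shape[0]
    print(f"Matrix: {Path(path).stem}")
    print(f"Shape: {matrix.shape}")
    print(f"NNZ: {nnz}")
    print(f"Density: {nnz / (rows * cols)}")

    # Time the repeated sparse matrix-vector products; the "Time: ... seconds"
    # line above is assumed to come from a loop like this.
    start = time.time()
    for _ in range(iterations):
        y = matrix @ x
    print(f"Time: {time.time() - start} seconds")


if __name__ == "__main__":
    main()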
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
  ).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([    0,     9,     9,  ..., 54704, 54704, 54705]),
       col_indices=tensor([    1,     2,     3,  ..., 17949, 22685,   144]),
       values=tensor([1., 1., 1.,  ..., 1., 1., 1.]), size=(22687, 22687),
       nnz=54705, layout=torch.sparse_csr)
tensor([0.7780, 0.3388, 0.1540,  ..., 0.2989, 0.3682, 0.9160])
Matrix: p2p-Gnutella25
Shape: torch.Size([22687, 22687])
NNZ: 54705
Density: 0.00010628522108964806
Time: 1.4235138893127441 seconds

 Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella25.mtx 1000':

           331,765      BR_MIS_PRED_RETIRED:u        #    0.0 per branch  branch_misprediction_ratio
        19,906,014      BR_RETIRED:u

       4.757340585 seconds time elapsed

      16.412311000 seconds user
      29.238029000 seconds sys
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
  ).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([    0,     9,     9,  ..., 54704, 54704, 54705]),
       col_indices=tensor([    1,     2,     3,  ..., 17949, 22685,   144]),
       values=tensor([1., 1., 1.,  ..., 1., 1., 1.]), size=(22687, 22687),
       nnz=54705, layout=torch.sparse_csr)
tensor([0.4944, 0.8057, 0.8211,  ..., 0.5137, 0.3388, 0.6316])
Matrix: p2p-Gnutella25
Shape: torch.Size([22687, 22687])
NNZ: 54705
Density: 0.00010628522108964806
Time: 1.4664146900177002 seconds

 Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella25.mtx 1000':

        28,194,337      L1I_TLB:u                    #    0.0 per TLB access  itlb_walk_ratio
             5,083      ITLB_WALK:u
            17,916      DTLB_WALK:u                  #    0.0 per TLB access  dtlb_walk_ratio
        37,944,713      L1D_TLB:u

       4.844329421 seconds time elapsed

      16.081022000 seconds user
      28.021902000 seconds sys
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
  ).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([    0,     9,     9,  ..., 54704, 54704, 54705]),
       col_indices=tensor([    1,     2,     3,  ..., 17949, 22685,   144]),
       values=tensor([1., 1., 1.,  ..., 1., 1., 1.]), size=(22687, 22687),
       nnz=54705, layout=torch.sparse_csr)
tensor([0.0963, 0.5806, 0.0397,  ..., 0.1604, 0.5700, 0.8103])
Matrix: p2p-Gnutella25
Shape: torch.Size([22687, 22687])
NNZ: 54705
Density: 0.00010628522108964806
Time: 1.3717434406280518 seconds

 Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella25.mtx 1000':

        31,162,212      L1I_CACHE:u                  #    0.0 per cache access  l1i_cache_miss_ratio
           270,684      L1I_CACHE_REFILL:u
           465,467      L1D_CACHE_REFILL:u           #    0.0 per cache access  l1d_cache_miss_ratio
        32,857,500      L1D_CACHE:u

       4.598461782 seconds time elapsed

      15.609727000 seconds user
      30.606837000 seconds sys
/nfshomes/vut/ampere_research/pytorch/spmv.py:20: UserWarning: Sparse CSR tensor support is in beta state. If you miss a functionality in the sparse tensor support, please submit a feature request to https://github.com/pytorch/pytorch/issues. (Triggered internally at /space/jenkins/workspace/Releases/pytorch-dls/pytorch-dls/aten/src/ATen/SparseCsrTensorImpl.cpp:55.)
  ).to_sparse_csr().type(torch.float)
tensor(crow_indices=tensor([    0,     9,     9,  ..., 54704, 54704, 54705]),
       col_indices=tensor([    1,     2,     3,  ..., 17949, 22685,   144]),
       values=tensor([1., 1., 1.,  ..., 1., 1., 1.]), size=(22687, 22687),
       nnz=54705, layout=torch.sparse_csr)
tensor([0.9137, 0.5009, 0.7507,  ..., 0.6623, 0.8760, 0.2991])
Matrix: p2p-Gnutella25
Shape: torch.Size([22687, 22687])
NNZ: 54705
Density: 0.00010628522108964806
Time: 1.4291880130767822 seconds

 Performance counter stats for 'apptainer run pytorch-altra.sif -c numactl --cpunodebind=0 --membind=0 python spmv.py matrices/p2p-Gnutella25.mtx 1000':

           541,118      LL_CACHE_MISS_RD:u           #    1.0 per cache access  ll_cache_read_miss_ratio
           564,199      LL_CACHE_RD:u
           194,022      L2D_TLB:u                    #    0.1 per TLB access  l2_tlb_miss_ratio
            23,932      L2D_TLB_REFILL:u
           311,476      L2D_CACHE_REFILL:u           #    0.2 per cache access  l2_cache_miss_ratio
         1,783,574      L2D_CACHE:u

       4.792239951 seconds time elapsed

      15.902307000 seconds user
      28.747620000 seconds sys
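The ratio metrics in the comment column of the last four runs are printed to one decimal place, so most of them show as 0.0. The exact ratios can be recovered from the raw counts logged above; the snippet below does that (note each counter pair comes from a separate run, so the ratios are not all from the same execution).

# Recompute the derived ratios from the raw counts in the perf output above.
counts = {
    "branch_misprediction_ratio": (331_765, 19_906_014),   # BR_MIS_PRED_RETIRED / BR_RETIRED
    "itlb_walk_ratio":            (5_083, 28_194_337),     # ITLB_WALK / L1I_TLB
    "dtlb_walk_ratio":            (17_916, 37_944_713),    # DTLB_WALK / L1D_TLB
    "l1i_cache_miss_ratio":       (270_684, 31_162_212),   # L1I_CACHE_REFILL / L1I_CACHE
    "l1d_cache_miss_ratio":       (465_467, 32_857_500),   # L1D_CACHE_REFILL / L1D_CACHE
    "ll_cache_read_miss_ratio":   (541_118, 564_199),      # LL_CACHE_MISS_RD / LL_CACHE_RD
    "l2_tlb_miss_ratio":          (23_932, 194_022),       # L2D_TLB_REFILL / L2D_TLB
    "l2_cache_miss_ratio":        (311_476, 1_783_574),    # L2D_CACHE_REFILL / L2D_CACHE
}

for name, (numerator, denominator) in counts.items():
    print(f"{name}: {numerator / denominator:.4f}")

With full precision, the branch misprediction ratio works out to roughly 0.017 and the last-level-cache read miss ratio to roughly 0.96, even though they print above as 0.0 and 1.0 respectively.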