
[perfsonar-user] No iperf2 offered on perfSONAR 5.0 when using pscheduler?


  • From: 장민석 <>
  • To:
  • Subject: [perfsonar-user] No iperf2 offered on perfSONAR 5.0 when using pscheduler?
  • Date: Fri, 02 Jun 2023 23:47:43 +0900

Hi, all.

I'm Minseok Jang from KREONET, South Korea.


I tried to configure regular tests between the following nodes using pScheduler, in order to test a 100 Gbps line between Korea and the USA (a sketch of the kind of repeating task I have in mind follows the node list).

ps-daej.kreonet2.net : 5.0.1-1.el7 / Daejeon(Korea)

ps-chic.kreonet2.net : 4.4.3-1.el7 / Chicago(USA)
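
For reference, the regularly repeating task I have in mind would, if I understand the CLI correctly, look something like this sketch, repeating a 30-second, 8-stream throughput test every hour (the repeat interval and test length are placeholders):

$ pscheduler task --repeat PT1H throughput -s ps-daej.kreonet2.net -d ps-chic.kreonet2.net -t 30 -P 8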


But when I submit the task (full command and debug output in cf 2 below), pScheduler returns the following error:

Failed to post task: Unable to complete request: No tool in common among the participants:  ps-daej.kreonet2.net offered iperf3, nuttcp;  ps-chic.kreonet2.net offered iperf3, nuttcp, iperf2.


perfSONAR 5.0.1-1.el7 does not seem to offer iperf2 via pScheduler. Is that right?
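
If it helps anyone reproduce this, my understanding is that the tool plugins each host offers can be listed roughly as below (hedging: the 'plugins' subcommand and the /pscheduler/tools endpoint are just my reading of the CLI and REST API):

$ pscheduler plugins tools        # run on each host; should list the locally installed tool plugins
$ curl -sk https://ps-chic.kreonet2.net/pscheduler/tools        # the same list via the REST API, if I read it correctly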

How can I configure high-throughput tests via pScheduler?
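
In the meantime, the closest workaround I can think of is to force iperf3, which both hosts offer. A sketch, untested, mirroring the iperf2 parameters from cf 2 and assuming the iperf3 plugin honours the same window-size/MSS options:

$ pscheduler task --tool iperf3 throughput -w 256M -m 8972 -s $A -d $B -i 1 -t 10 -P 8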


FYI, I have attached two logs below: one from a manual iperf2 run (cf 1) and one from the pScheduler attempt (cf 2).


Best Regards,

Minseok Jang

----

KREONET Center / Division of Science and Technology Digital Convergence

Korea Institute of Science and Technology Information


Email:

LinkedIn: https://www.linkedin.com/in/msjang

Phone: +82-42-869-1292

----


cf 1. A manual iperf2 test gives about 70 Gbps.

ps-daej.kreonet2.net $ iperf -c ps-chic.kreonet2.net -t 10 -i 1 -P 8 -p 5891 -w 256MB -M 8972

WARNING: attempt to set TCP maximum segment size to 8972, but got 536

...

------------------------------------------------------------

Client connecting to ps-chic.kreonet2.net, TCP port 5891

TCP window size:  512 MByte (WARNING: requested  256 MByte)

------------------------------------------------------------

[  4] local 134.75.207.2 port 60538 connected with 134.75.253.114 port 5891

...

[ ID] Interval       Transfer     Bandwidth

[SUM]  0.0- 1.0 sec  3.67 GBytes  31.5 Gbits/sec

[SUM]  1.0- 2.0 sec   364 MBytes  3.06 Gbits/sec

[SUM]  2.0- 3.0 sec  10.1 GBytes  86.9 Gbits/sec

...

[SUM]  9.0-10.0 sec  9.78 GBytes  84.0 Gbits/sec

[SUM]  0.0-10.2 sec  81.2 GBytes  68.7 Gbits/sec


cf 2. Full pScheduler command attempting a throughput test with iperf2, and its debug output.

$ A=ps-daej.kreonet2.net

$ B=ps-chic.kreonet2.net

$ pscheduler task --tool iperf2 --debug throughput -w 256M -m 8972 -s $A -d $B -i 1 -t 10 -P 8

2023-06-02T23:31:42 Debug started

2023-06-02T23:31:42 Assistance is from ps-daej.kreonet2.net bound from None

2023-06-02T23:31:42 Forcing default slip of PT5M

2023-06-02T23:31:42 Converting to spec via https://ps-daej.kreonet2.net/pscheduler/tests/throughput/spec

Submitting task...

2023-06-02T23:31:42 Fetching participant list

2023-06-02T23:31:42 Spec is: {"source":"ps-daej.kreonet2.net","dest":"ps-chic.kreonet2.net","duration":"PT10S","interval":"PT1S","parallel":8,"window-size":256000000,"mss":8972,"schema":1}

2023-06-02T23:31:42 Params are: {'spec': '{"source":"ps-daej.kreonet2.net","dest":"ps-chic.kreonet2.net","duration":"PT10S","interval":"PT1S","parallel":8,"window-size":256000000,"mss":8972,"schema":1}'}

2023-06-02T23:31:42 Got participants: {'participants': ['ps-daej.kreonet2.net', 'ps-chic.kreonet2.net']}

2023-06-02T23:31:42 Lead is ps-daej.kreonet2.net

2023-06-02T23:31:42 Pinging lead ps-daej.kreonet2.net

2023-06-02T23:31:42 Posting task to https://ps-daej.kreonet2.net/pscheduler/tasks

2023-06-02T23:31:42 Data is {'schedule': {'slip': 'PT5M'}, 'test': {'spec': {'source': 'ps-daej.kreonet2.net', 'dest': 'ps-chic.kreonet2.net', 'duration': 'PT10S', 'interval': 'PT1S', 'parallel': 8, 'window-size': 256000000, 'mss': 8972, 'schema': 1}, 'type': 'throughput'}, 'tools': ['iperf2']}

Failed to post task: Unable to complete request: No tool in common among the participants:  ps-daej.kreonet2.net offered iperf3, nuttcp;  ps-chic.kreonet2.net offered iperf3, nuttcp, iperf2.


The 'pscheduler troubleshoot' command may be of use in problem diagnosis. 'pscheduler troubleshoot --help' for more information.



