I've traced this down to a problem resulting from the bwctld on
bwctl.atla.net.internet2.edu not producing the correct metadata in
its output to clients.
The output header should look something like this:
bwctl: start_endpoint: 3624691387.449529
bwctl: run_endpoint: receiver: 64.57.16.66
bwctl: run_endpoint: sender: 138.26.220.65
bwctl: exec_line: iperf -c 64.57.16.66 -B 138.26.220.65 -f b -P 1 -m -p 5014 -t 20 -i 1.000000
bwctl: run_tool: tester: iperf
bwctl: run_tool: receiver: 64.57.16.66
bwctl: run_tool: sender: 138.26.220.65
bwctl: start_tool: 3624691432.432794
But the problematic node is only producing a metadata header that
contains:
bwctl: exec_line: /usr/bin/iperf -B 64.57.16.66 -s -f b -P 1 -m -p 5014 -t 20 -i 1
bwctl: start_tool: 3624691430.401291
This causes the perfSONAR scripts to mis-parse the results, failing
on the $tool assignment because the line doesn't match "bwctl:
exec_line: iperf". I changed that pattern to accept full paths;
the basic result parse then succeeds but, due to the missing source
data, the results cannot be recorded in the tests database.
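For reference, the relaxed match I tried was along these lines (a
rough sketch, not the exact diff; it just tolerates a leading path
in front of the tool name):

    elsif ($line =~ m{bwctl: exec_line: (?:\S*/)?iperf\b}) {
        # accept both "iperf" and "/usr/bin/iperf"
        $results{tool} = "iperf";
    }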
I'm not sure why this node is behaving this way.
You can reproduce this behavior by running the following tests.
Both the send and recv files should include the complete header;
however, when bwctl.atla.net.internet2.edu returns data it produces
an incomplete header, which then causes the parse failure in
perfSONAR.
Test with bwctl.atla as receiver produces the borked header:
/usr/bin/bwctl -s perfsonar-10g-scidmz.uabgrid.uab.edu -c bwctl.atla.net.internet2.edu -i 1 --parsable -4 -T iperf -p -d ./ -v -P 1 -t 20 -x
$ head 15567942049171924331.recv.bw
bwctl: exec_line: /usr/bin/iperf -B 64.57.16.66 -s -f b -P 1 -m -p 5015 -t 20 -i 1
bwctl: start_tool: 3624693965.158426
------------------------------------------------------------
Server listening on TCP port 5015
Binding to local address 64.57.16.66
TCP window size: 87380 Byte (default)
------------------------------------------------------------
[ 15] local 64.57.16.66 port 5015 connected with 138.26.220.65
port 49750
[ ID] Interval Transfer Bandwidth
[ 15] 0.0- 1.0 sec 62630432 Bytes 501043456 bits/sec
Test with a working node as receiver produces a legit header:
/usr/bin/bwctl -s perfsonar-10g-scidmz.uabgrid.uab.edu -c 56m-ps.sox.net -i 1 --parsable -4 -T iperf -p -d ./ -v -P 1 -t 20 -x
$ head 15567942589368762368.recv.bw
bwctl: start_endpoint: 3624694082.873576
bwctl: run_endpoint: receiver: 143.215.194.125
bwctl: run_endpoint: sender: 138.26.220.65
bwctl: exec_line: iperf -B 143.215.194.125 -s -f b -P 1 -m -p 5128 -t 20 -i 1.000000
bwctl: run_tool: tester: iperf
bwctl: run_tool: receiver: 143.215.194.125
bwctl: run_tool: sender: 138.26.220.65
bwctl: start_tool: 3624694091.000167
------------------------------------------------------------
Server listening on TCP port 5128
I've contacted the maintainers of the failed node but thought the
debug info might help others determine if their problematic one-way
target nodes are behaving similarly.
~jpr
On 11/10/2014 04:58 PM, John-Paul Robinson wrote:
While the cause of the problem does seem to be a failure to parse
the iperf output, the source of the problem doesn't appear to be
Bwctl.pm but rather the bwctl process itself.
When I run bwctl against a known working end point:
/usr/bin/bwctl -c 56m-ps.sox.net -i 1 --parsable -4 -T iperf -p -d ./soxtest/ -v -P 1 -t 20 -I 300
The results file created contains the "locally" gathered bwctl
results, which include the correctly parsable header describing the
metadata for the test:
$ cat soxtest/15567672641991999488.recv.bw
bwctl: start_endpoint: 3624631187.210311
bwctl: run_endpoint: receiver: 143.215.194.125
bwctl: run_endpoint: sender: 138.26.220.65
bwctl: exec_line: iperf -B 143.215.194.125 -s -f b -P 1 -m -p 5045 -t 20 -i 1.000000
bwctl: run_tool: tester: iperf
bwctl: run_tool: receiver: 143.215.194.125
bwctl: run_tool: sender: 138.26.220.65
bwctl: start_tool: 3624631239.004069
However, when I run an identical test against the problematic
endpoint:
/usr/bin/bwctl -c 64.57.16.66 -T iperf -4 -I 900 -p -d i2atla-iperfonly -v -P 1 -t 20 -i 1 --parsable
The bwctl result file contains the "remotely" gathered results,
which have insufficient metadata for correct parsing:
$ cat i2atla-iperfonly/15567745226308775992.recv.bw
bwctl: exec_line: /usr/bin/iperf -B 64.57.16.66 -s -f b -P 1 -m -p 5009 -t 20 -i 1
bwctl: start_tool: 3624648139.253983
------------------------------------------------------------
Server listening on TCP port 5009
Binding to local address 64.57.16.66
TCP window size: 87380 Byte (default)
------------------------------------------------------------
It's actually the first line with the /usr/bin/iperf path that
causes the unparsable errors reported earlier in Bwctl.pm
(below). I changed the pattern match thinking this was the bug,
but although the data is then parsed, it fails to be added to the
Cassandra database because of a missing source address (there
isn't one on the iperf server side). Stracing the bwctl
process shows that the parsable local data is received but is not
written to the test output file.
It's not clear why bwctl would have this odd behavior for just
this one host. It seems almost as if the internal stdout/stderr
channels are somehow crossed. If bwctl wrote the correct, locally
generated output it would have the necessary metadata header and
would parse with the existing Bwctl.pm code.
Any thoughts on what's going on here?
~jpr
On 11/10/2014 01:31 PM, John-Paul Robinson wrote:
Hi Folks,
From what I can tell the problem my tests are running into has
to do with having iperf results instead of iperf3 results. The
host that is failing negotiates tests with iperf. All my other
test endpoints negotiate tests with iperf3.
It appears there is some sort of parsing error in the
parse_bwctl_output() routine. The error in the
regular_testing.log relates to the $tool variable not being set,
which eventually bubbles up to a full parsing error (see the log
below in my previous email).
The code segment where $tool would be set looks like this:
elsif ($line =~ /bwctl: exec_line: iperf/) {
    $results{tool} = "iperf";
}
elsif ($line =~ /bwctl: exec_line: nuttcp/) {
    $results{tool} = "nuttcp";
}
elsif ($line =~ /bwctl: run_tool: tester: (.*)/) {
    $results{tool} = $1;
It appears that when the test is an "iperf" test it should match the
line "bwctl: exec_line: iperf". For whatever reason the value
for $tool doesn't get set, and that triggers the subsequent
warnings and the eventual parse failure.
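One way to make this tolerant of a full path (sketched here as a
possible workaround, not the code that ships in Bwctl.pm) would be a
fallback that takes the basename of whatever executable exec_line
reports, placed after the more specific matches:

    elsif ($line =~ /bwctl: exec_line: (\S+)/) {
        # Fallback sketch: "/usr/bin/iperf" and "iperf" both yield "iperf".
        require File::Basename;
        $results{tool} = File::Basename::basename($1);
    }

Even with that in place, though, the missing run_endpoint lines
still leave the source address undefined, so the incomplete header
remains the real problem.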
I suspect this is why the src-to-dst test is failing and why I'm
not seeing the results for these tests in my graphs.
~jpr
On 11/07/2014 10:22 AM, John-Paul Robinson wrote:
I'm seeing something similar with one of my tests. It works
bi-directionally on the command line, but the automated web
test is no longer working in the local-to-remote direction.
I've poked around the regular testing log some and suspect
there may be some empty data value borking the test results by
causing perl to barf on them.
Here is the log for the failed local-to-remote part of the
tests. Note that the test seems to complete, given that the
gap in the log entry times matches the reported wait time:
2014/11/07 00:00:21 (13317) DEBUG>
Cmd.pm:130
perfSONAR_PS::RegularTesting::Utils::CmdRunner::Cmd::readlines
- Lines: $VAR1 = [
'bwctl: Using
138.26.220.65 as the address for local sender',
'bwctl: Using
64.57.16.66 as the address for remote receiver',
'bwctl: Available
in-common: iperf nuttcp ',
'bwctl: Using tool:
iperf'
];
2014/11/07 00:00:21 (13317) DEBUG> Cmd.pm:130
perfSONAR_PS::RegularTesting::Utils::CmdRunner::Cmd::readlines
- Lines: $VAR1 = [
'bwctl: Server \'64.57.16.66\' accepted test
request at time 1415340026.515434'
];
2014/11/07 00:00:21 (13317) DEBUG> Cmd.pm:130
perfSONAR_PS::RegularTesting::Utils::CmdRunner::Cmd::readlines
- Lines: $VAR1 = [
'bwctl: Client \'localhost\' accepted test
request at time 1415340026.515434'
];
2014/11/07 00:00:21 (13317) DEBUG> Cmd.pm:130
perfSONAR_PS::RegularTesting::Utils::CmdRunner::Cmd::readlines
- Lines: $VAR1 = [
'bwctl: 26 seconds until test results
available'
];
2014/11/07 00:00:47 (13317) DEBUG> Cmd.pm:130
perfSONAR_PS::RegularTesting::Utils::CmdRunner::Cmd::readlines
- Lines: $VAR1 = [
'/var/lib/perfsonar/regular_testing/bwctl_Du5FB/15566373779833847474.recv.bw'
];
2014/11/07 00:00:47 (13317) WARN> daemon:103
main::__ANON__ - Warned: Use of uninitialized value $tool
in string eq at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Parsers/Bwctl.pm
line 117.
2014/11/07 00:00:47 (13317) WARN> daemon:103
main::__ANON__ - Warned: Use of uninitialized value $tool
in string eq at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Parsers/Bwctl.pm
line 120.
2014/11/07 00:00:47 (13317) WARN> daemon:103
main::__ANON__ - Warned: Use of uninitialized value $tool
in string eq at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Parsers/Bwctl.pm
line 123.
2014/11/07 00:00:47 (13317) WARN> daemon:103
main::__ANON__ - Warned: Use of uninitialized value $tool
in string eq at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Parsers/Bwctl.pm
line 126.
2014/11/07 00:00:47 (13317) WARN> daemon:103
main::__ANON__ - Warned: Use of uninitialized value $tool
in string eq at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Parsers/Bwctl.pm
line 129.
2014/11/07 00:00:47 (13317) WARN> daemon:103
main::__ANON__ - Warned: Use of uninitialized value $tool
in string eq at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Parsers/Bwctl.pm
line 132.
2014/11/07 00:00:47 (13317) WARN> daemon:103
main::__ANON__ - Warned: Use of uninitialized value $tool
in concatenation (.) or string at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Parsers/Bwctl.pm
line 136.
2014/11/07 00:00:47 (13317) ERROR> CmdRunner.pm:110
perfSONAR_PS::RegularTesting::Utils::CmdRunner::run -
Problem with results callback: Attribute (tool) does not
pass the type constraint because: Validation failed for
'Str' with value undef at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Tests/Bwctl.pm
line 119
perfSONAR_PS::RegularTesting::Tests::Bwctl::__ANON__('perfSONAR_PS::RegularTesting::Tests::Bwctl=HASH(0x46501d8)',
'HASH(0x4656c38)') called at
/usr/lib64/perl5/vendor_perl/Moose/Meta/Method/Overridden.pm
line 36
perfSONAR_PS::RegularTesting::Tests::Bwctl::build_results('perfSONAR_PS::RegularTesting::Tests::Bwctl=HASH(0x46501d8)',
'HASH(0x4656c38)') called at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Tests/BwctlBase.pm
line 237
perfSONAR_PS::RegularTesting::Tests::BwctlBase::handle_output('perfSONAR_PS::RegularTesting::Tests::Bwctl=HASH(0x46501d8)',
'HASH(0x4880ad0)') called at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Tests/BwctlBase.pm
line 180
perfSONAR_PS::RegularTesting::Tests::BwctlBase::__ANON__('perfSONAR_PS::RegularTesting::Utils::CmdRunner::Cmd=HASH(0x47...',
'HASH(0x46a2850)') called at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Utils/CmdRunner.pm
line 108
eval {...} called at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Utils/CmdRunner.pm
line 107
perfSONAR_PS::RegularTesting::Utils::CmdRunner::run('perfSONAR_PS::RegularTesting::Utils::CmdRunner=HASH(0x478afc0)')
called at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Tests/BwctlBase.pm
line 188
perfSONAR_PS::RegularTesting::Tests::BwctlBase::__ANON__('perfSONAR_PS::RegularTesting::Tests::Bwctl=HASH(0x46501d8)',
'HASH(0x4755cd8)') called at
/usr/lib64/perl5/vendor_perl/Moose/Meta/Method/Overridden.pm
line 36
perfSONAR_PS::RegularTesting::Tests::BwctlBase::run_test('perfSONAR_PS::RegularTesting::Tests::Bwctl=HASH(0x46501d8)',
'HASH(0x4755cd8)') called at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Test.pm
line 75
perfSONAR_PS::RegularTesting::Test::run_test('perfSONAR_PS::RegularTesting::Test=HASH(0x465ed40)',
'handle_results', 'CODE(0x1415c78)') called at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Master/SelfScheduledTestChild.pm
line 50
eval {...} called at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Master/SelfScheduledTestChild.pm
line 40
perfSONAR_PS::RegularTesting::Master::SelfScheduledTestChild::__ANON__('perfSONAR_PS::RegularTesting::Master::SelfScheduledTestChild=...')
called at
/usr/lib64/perl5/vendor_perl/Moose/Meta/Method/Overridden.pm
line 36
perfSONAR_PS::RegularTesting::Master::SelfScheduledTestChild::child_main_loop('perfSONAR_PS::RegularTesting::Master::SelfScheduledTestChild=...')
called at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Master/BaseChild.pm
line 32
perfSONAR_PS::RegularTesting::Master::BaseChild::run('perfSONAR_PS::RegularTesting::Master::SelfScheduledTestChild=...')
called at
/opt/perfsonar_ps/regular_testing/bin/../lib/perfSONAR_PS/RegularTesting/Master.pm
line 182
perfSONAR_PS::RegularTesting::Master::run('perfSONAR_PS::RegularTesting::Master=HASH(0x4671bb8)')
called at /opt/perfsonar_ps/regular_testing/bin/daemon
line 122
2014/11/07 00:00:47 (13317) DEBUG> Cmd.pm:130
perfSONAR_PS::RegularTesting::Utils::CmdRunner::Cmd::readlines
- Lines: $VAR1 = [
'bwctl: 913 seconds until next testing
period'
];
Here is the corresponding part of the log for the successful
remote-to-local direction. It runs the test, gets the
results, parses them, and sticks them in the database (given
that I see them in the graphs):
2014/11/07 00:01:27 (13317) DEBUG>
Cmd.pm:130
perfSONAR_PS::RegularTesting::Utils::CmdRunner::Cmd::readlines
- Lines: $VAR1 = [
'bwctl: Using
bwctl.atla.net.internet2.edu as the address for remote
sender',
'bwctl: Using
138.26.220.65 as the address for local receiver',
'bwctl: Available
in-common: iperf nuttcp ',
'bwctl: Using tool:
iperf'
];
2014/11/07 00:01:27 (13317) DEBUG> Cmd.pm:130
perfSONAR_PS::RegularTesting::Utils::CmdRunner::Cmd::readlines
- Lines: $VAR1 = [
'bwctl: Server \'localhost\' accepted test
request at time 1415340091.510944'
];
2014/11/07 00:01:27 (13317) DEBUG> Cmd.pm:130
perfSONAR_PS::RegularTesting::Utils::CmdRunner::Cmd::readlines
- Lines: $VAR1 = [
'bwctl: Client
\'bwctl.atla.net.internet2.edu\' accepted test request at
time 1415340091.510944'
];
2014/11/07 00:01:27 (13317) DEBUG> Cmd.pm:130
perfSONAR_PS::RegularTesting::Utils::CmdRunner::Cmd::readlines
- Lines: $VAR1 = [
'bwctl: 25 seconds until test results
available'
];
2014/11/07 00:01:52 (13317) DEBUG> Cmd.pm:130
perfSONAR_PS::RegularTesting::Utils::CmdRunner::Cmd::readlines
- Lines: $VAR1 = [
'/var/lib/perfsonar/regular_testing/bwctl_BkMZ9/15566374058987434689.recv.bw'
];
2014/11/07 00:01:52 (13317) DEBUG> Bwctl.pm:138
perfSONAR_PS::RegularTesting::Tests::Bwctl::__ANON__ -
Build Results: $VAR1 = {
'summary_results' => {
'summary_results' =>
{
'throughput' => '4480866964'
},
'streams' => [
{
'throughput' => '4480866964',
'stream_id' => '15'
}
]
},
'protocol' => 'tcp',
'source' => {
'protocol' => 'tcp',
'address' => '64.57.16.66',
'hostname' =>
'bwctl.atla.net.internet2.edu'
},
'errors' => [],
'destination' => {
'protocol' => 'tcp',
'address' =>
'138.26.220.65'
},
'raw_results' => 'bwctl: start_endpoint:
3624328887.156787
bwctl: run_endpoint: receiver: 138.26.220.65
bwctl: run_endpoint: sender: 64.57.16.66
bwctl: exec_line: iperf -B 138.26.220.65 -s -f b -P 1 -m
-p 5182 -t 20 -i 1.000000
bwctl: run_tool: tester: iperf
bwctl: run_tool: receiver: 138.26.220.65
bwctl: run_tool: sender: 64.57.16.66
bwctl: start_tool: 3624328889.822644
------------------------------------------------------------
Server listening on TCP port 5182
Binding to local address 138.26.220.65
TCP window size: 87380 Byte (default)
------------------------------------------------------------
[ 15] local 138.26.220.65 port 5182 connected with
64.57.16.66 port 5182
[ ID] Interval Transfer Bandwidth
[ 15] 0.0- 1.0 sec 278847688 Bytes 2230781504
bits/sec
[ 15] 1.0- 2.0 sec 574872044 Bytes 4598976352
bits/sec
[ 15] 2.0- 3.0 sec 574443704 Bytes 4595549632
bits/sec
[ 15] 3.0- 4.0 sec 575490620 Bytes 4603924960
bits/sec
[ 15] 4.0- 5.0 sec 573504164 Bytes 4588033312
bits/sec
[ 15] 5.0- 6.0 sec 576805976 Bytes 4614447808
bits/sec
[ 15] 6.0- 7.0 sec 575192188 Bytes 4601537504
bits/sec
[ 15] 7.0- 8.0 sec 575457480 Bytes 4603659840
bits/sec
[ 15] 8.0- 9.0 sec 574730536 Bytes 4597844288
bits/sec
[ 15] 9.0-10.0 sec 573548904 Bytes 4588391232
bits/sec
[ 15] 10.0-11.0 sec 574434756 Bytes 4595478048
bits/sec
[ 15] 11.0-12.0 sec 575257476 Bytes 4602059808
bits/sec
[ 15] 12.0-13.0 sec 575052664 Bytes 4600421312
bits/sec
[ 15] 13.0-14.0 sec 575186388 Bytes 4601491104
bits/sec
[ 15] 14.0-15.0 sec 575061116 Bytes 4600488928
bits/sec
[ 15] 15.0-16.0 sec 575683832 Bytes 4605470656
bits/sec
[ 15] 16.0-17.0 sec 575449524 Bytes 4603596192
bits/sec
[ 15] 17.0-18.0 sec 574237900 Bytes 4593903200
bits/sec
[ 15] 18.0-19.0 sec 574121576 Bytes 4592972608
bits/sec
[ 15] 19.0-20.0 sec 574765336 Bytes 4598122688
bits/sec
[ 15] 0.0-20.0 sec 11209015296 Bytes 4480866964
bits/sec
[ 15] MSS size 8948 bytes (MTU 8988 bytes, unknown
interface)
bwctl: stop_tool: 3624328911.581018
bwctl: stop_endpoint: 3624328912.581453
',
'start_time' => '2014-11-07T06:01:29',
'end_time' => '2014-11-07T06:01:51',
'tool' => 'iperf',
'streams' => 1,
'type' => 'throughput',
'intervals' => [
... and so forth
This gives me some confidence that the problem is specific to the
local-to-remote direction.
~jpr
On 11/04/2014 08:53 AM, Jarvis Zhang wrote:
Hi,
I set up several tests on my host, but some tests only work
unidirectionally. For throughput, the problem is a little
complicated. Here is my address: http://115.25.138.244/serviceTest/psGraph.cgi
The throughput test works on the command line
but fails in the web interface.
The one-way delay test doesn't work even on the
command line; I think the problem may have something to
do with the port, but I have no idea where to start.
The ping delay test never works.
The attachment is regular_testing.log. Here are
more details:
1. For the throughput test:
a) 115.25.138.244 (my host) to 128.104.100.67 (komatsu) stopped
working when I updated from v3.3.2 to v3.4 yesterday, but the
command line still works.
[zwzhang@jsi-inspur02 perfsonar]$ bwctl -c 128.104.100.67
bwctl: Using tool: iperf
bwctl: 27 seconds until test results available
RECEIVER START
bwctl: exec_line: iperf -B komatsu.chtc.wisc.edu
-s -f b -m -p 5019 -t 10
bwctl: start_tool: 3624097458.082252
------------------------------------------------------------
Server listening on TCP port 5019
Binding to local address komatsu.chtc.wisc.edu
TCP window size: 87380 Byte (default)
------------------------------------------------------------
[ 14] local 128.104.100.67 port 5019 connected with
115.25.138.244 port 52456
[ ID] Interval Transfer Bandwidth
[ 14] 0.0-11.5 sec 36438016 Bytes 25453583
bits/sec
[ 14] MSS size 1436 bytes (MTU 1500 bytes, ethernet)
bwctl: stop_exec: 3624097473.816542
RECEIVER END
b) 2001:da8:203:d406:16da:e9ff:fef9:b68f (my host) to
2607:f388:107c:501:202:c9ff:fee8:85a0 (komatsu) never works,
but the command line still works, as above.
c) 159.226.15.235 (cnic) to 115.25.138.244 (my host) stopped
working when I updated from v3.3.2 to v3.4 yesterday;
the command line fails as well.
[zwzhang@jsi-inspur02 perfsonar]$ bwctl -s 159.226.15.235
bwctl: Using tool: iperf
bwctl: 20 seconds until test results available
RECEIVER START
bwctl: remote peer cancelled test
RECEIVER END
d) 2400:dd01:1011:1:92b1:1cff:fe0c:7c0a (cnic) to
2001:da8:203:d406:16da:e9ff:fef9:b68f (my host) stopped working
when I updated from v3.3.2 to v3.4 yesterday; the command line
fails as well, as above.
e) 115.25.138.244
to 137.110.119.113 works well bidirectionally.
2. For one-way latency:
a) 128.104.100.67 (komatsu) to 115.25.138.244 (my host)
doesn't work, and neither does the command line.
--- owping statistics from [128.104.100.67]:8944 to [jsi-inspur02]:8841 ---
SID: 73198af4d8035c80a36f5244c1e63df9
first: 2014-11-04T22:09:06.866
last: 2014-11-04T22:09:17.026
100 sent, 100 lost (100.000%), 0 duplicates
one-way delay min/median/max = nan/nan/nan ms,
(err=0.104 ms)
one-way jitter = nan ms (P95-P50)
TTL not reported
no reordering
b) 159.226.15.235 (cnic) to 115.25.138.244 (my host) doesn't
work, and neither does the command line.
c) 137.110.119.113 (murpa) doesn't work in either direction.
d) Other IPv6 addresses work well.
3. None of the ping latency tests work.
Any insights appreciated.
Thanks,
Jarvis