Hey Steven,
Thanks for your reply. The output of "emanesh n2 get stat '*' phy" is 
below, but I'm not quite sure which statistics relate to location 
events. My guess would be 
'processedUpstreamControl'/'processedDownstreamControl'?

[emanesh (n2:47000)] ## get stat * phy
nem 1   phy  avgDownstreamProcessingDelay0 = 130.09324646
nem 1   phy  avgProcessAPIQueueDepth = 1.00073424808
nem 1   phy  avgProcessAPIQueueWait = 61.8266486164
nem 1   phy  avgTimedEventLatency = 0.0
nem 1   phy  avgTimedEventLatencyRatio = 0.0
nem 1   phy  avgUpstreamProcessingDelay0 = 146.843505859
nem 1   phy  numDownstreamBytesBroadcastGenerated0 = 0
nem 1   phy  numDownstreamBytesBroadcastRx0 = 997828
nem 1   phy  numDownstreamBytesBroadcastTx0 = 997828
nem 1   phy  numDownstreamBytesUnicastGenerated0 = 0
nem 1   phy  numDownstreamBytesUnicastRx0 = 0
nem 1   phy  numDownstreamBytesUnicastTx0 = 0
nem 1   phy  numDownstreamPacketsBroadcastDrop0 = 0
nem 1   phy  numDownstreamPacketsBroadcastGenerated0 = 0
nem 1   phy  numDownstreamPacketsBroadcastRx0 = 7372
nem 1   phy  numDownstreamPacketsBroadcastTx0 = 7372
nem 1   phy  numDownstreamPacketsUnicastDrop0 = 0
nem 1   phy  numDownstreamPacketsUnicastGenerated0 = 0
nem 1   phy  numDownstreamPacketsUnicastRx0 = 0
nem 1   phy  numDownstreamPacketsUnicastTx0 = 0
nem 1   phy  numUpstreamBytesBroadcastRx0 = 4565630
nem 1   phy  numUpstreamBytesBroadcastTx0 = 2990224
nem 1   phy  numUpstreamBytesUnicastRx0 = 0
nem 1   phy  numUpstreamBytesUnicastTx0 = 0
nem 1   phy  numUpstreamPacketsBroadcastDrop0 = 14124
nem 1   phy  numUpstreamPacketsBroadcastRx0 = 36202
nem 1   phy  numUpstreamPacketsBroadcastTx0 = 22078
nem 1   phy  numUpstreamPacketsUnicastDrop0 = 0
nem 1   phy  numUpstreamPacketsUnicastRx0 = 0
nem 1   phy  numUpstreamPacketsUnicastTx0 = 0
nem 1   phy  processedConfiguration = 0
nem 1   phy  processedDownstreamControl = 0
nem 1   phy  processedDownstreamPackets = 7372
nem 1   phy  processedEvents = 8
nem 1   phy  processedTimedEvents = 0
nem 1   phy  processedUpstreamControl = 0
nem 1   phy  processedUpstreamPackets = 36202
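
Looking at that list again, I suspect 'processedEvents' is actually the 
counter that increments when location events arrive, rather than the 
control counters. The tutorial also queries a per-NEM location table by 
name; assuming that table is available in 0.9.2 under the name the 
tutorial uses (LocationEventInfoTable -- my assumption, I haven't 
confirmed it against this version), something like this should show 
whether any positions have been received:

```
[emanesh (n2:47000)] ## get table * phy LocationEventInfoTable
```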


I'm a bit puzzled why this doesn't just work out of the box with 
CORE/EMANE: I'm using essentially the default configuration, except 
that I have created a separate multicast group (224.1.2.9) and control 
interface (ctrl1 and eth1) for OTA traffic, as recommended by the CORE 
docs.
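
If CORE's event service turns out to be the problem, I might try 
bypassing it and publishing location events by hand with EMANE's EEL 
event generator. Assuming I have the EEL sentence format right (taken 
from the tutorial, so treat the exact syntax below as my guess), a 
minimal file giving both NEMs a position would look like:

```
# <time> nem:<id> location gps <lat>,<lon>,<alt>
0.0  nem:1 location gps 40.025495,-74.315441,3.0
0.0  nem:2 location gps 40.023235,-74.312889,3.0
```

Since 2ray needs a position for every transmitter the phy hears, that 
would explain the "drop propagation model missing info" message I'm 
seeing when positions never arrive.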

Dan

On 01/02/16 19:39, Steven Galgano wrote:
> Dan,
>
> Your XML indicates you are using 2ray which requires location events:
>
>   <param name="propagationmodel" value="2ray"/>
>
> It is easier to debug emane using the control port. Connect to each
> emulator instance and verify you are receiving location events using the
> emulator physical layer statistic tables.
>
> See the emane tutorial for more information on using the control port to
> debug an emulation:
>
>    https://github.com/adjacentlink/emane-tutorial/wiki
>
> --
> Steven Galgano
> Adjacent Link LLC
> www.adjacentlink.com
>
>
> On 02/01/2016 02:29 PM, Dan O'Keeffe wrote:
>> Hi,
>> I'm having trouble getting a distributed CORE/EMANE emulation working
>> because of what I think is an EMANE problem. In particular, OLSR
>> broadcast packets received from remote containers are dropped by emane
>> with something like the following log message:
>>
>> DEBUG PHYI 001 FrameworkPHY::processUpstreamPacket_i transmitter 4, src
>> 4, dst 65535, drop propagation model missing info
>>
>> I have 2 emulation servers in total, with 2 nodes running OLSR inside a
>> container on each (started using the CORE GUI).
>>
>> Nodes on the same machine can communicate fine, but not nodes on
>> different machines.
>>
>> Using tcpdump I can see that OLSR broadcast packets are reaching the
>> emane daemon running in the node containers on different machines.
>>
>> Does anyone know what my problem might be or how to debug it further?
>>
>> I'm running emane 0.9.2 and core 4.8. I've attached the emane configs
>> generated by core for one of the nodes in case that helps. The configs
>> are identical on the other machine except for nem and tap ids.
>>
>> Thanks,
>> Dan
>>
>>
>> _______________________________________________
>> emane-users mailing list
>> [email protected]
>> http://pf.itd.nrl.navy.mil/mailman/listinfo/emane-users
>>
