Hi All,

I have a single node with 5 resources running on it. When I reboot the node,
the resources sometimes end up in the Stopped state even though the node
comes back online.

Looking into the logs, the one difference I found between the success and
failure cases is this: in the failure case, "Election Trigger (I_DC_TIMEOUT)
just popped (20000ms)" occurred, and from that point the LRM did not start
the resources. It went straight to the monitor (probe) actions and never
issued any start actions afterwards.

In the success case this election timeout never fired: the first action the
LRM took was to start each resource and then begin monitoring it, so all the
resources came up properly.

Both logs are included below: the success case first (starting at 13:02),
then the failure case (starting at 13:10). Could someone please explain the
reason for this issue and how to solve it?
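
For comparing the two cases I just grepped the interesting crmd/lrmd lines
out of both logs. A rough sketch, assuming Pacemaker logs end up in
/var/log/syslog on this box (adjust the path for your logging setup):

root@sc-node-2:~# grep -E 'crm_timer_popped|do_state_transition|te_rsc_command|process_lrm_event' /var/log/syslog

In the failure case this shows the I_DC_TIMEOUT pop and the monitor_0
probes, but no *_start_0 actions afterwards.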


My CRM configuration is:

root@sc-node-2:~# crm configure show
node $id="2" sc-node-2
primitive oc-fw-agent upstart:oc-fw-agent \
meta allow-migrate="true" migration-threshold="5" failure-timeout="120s" \
op monitor interval="15s" timeout="60s"
primitive oc-lb-agent upstart:oc-lb-agent \
meta allow-migrate="true" migration-threshold="5" failure-timeout="120s" \
op monitor interval="15s" timeout="60s"
primitive oc-service-manager upstart:oc-service-manager \
meta allow-migrate="true" migration-threshold="5" failure-timeout="120s" \
op monitor interval="15s" timeout="60s"
primitive oc-vpn-agent upstart:oc-vpn-agent \
meta allow-migrate="true" migration-threshold="5" failure-timeout="120s" \
op monitor interval="15s" timeout="60s"
primitive sc_vip ocf:heartbeat:IPaddr2 \
params ip="200.10.10.188" cidr_netmask="24" nic="eth1" \
op monitor interval="15s"
group sc-resources sc_vip oc-service-manager oc-fw-agent oc-lb-agent oc-vpn-agent
property $id="cib-bootstrap-options" \
dc-version="1.1.10-42f2063" \
cluster-infrastructure="corosync" \
stonith-enabled="false" \
cluster-recheck-interval="3min" \
default-action-timeout="180s"
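
Side note: the 20000ms in the election trigger message seems to match the
dc-deadtime cluster property (default 20s), which controls how long crmd
waits to hear from an existing DC before starting its own election. If
tuning it turns out to be relevant, it can be set like any other cluster
property. The value below is only an illustration, not a verified fix:

root@sc-node-2:~# crm configure property dc-deadtime=10s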


-- 
Thanks and Regards,
Pritam Kharat.

----- Success case log (13:02, resources started) -----

Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_state_transition:     
State transition S_INTEGRATION -> S_FINALIZE_JOIN [ input=I_INTEGRATED 
cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: crmd_join_phase_log:     
join-1: sc-node-2=integrated
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_dc_join_finalize:     
join-1: Syncing our CIB to the rest of the cluster
Oct 29 13:02:15 [1016] sc-node-2        cib:   notice: corosync_node_name:      
Unable to get node name for nodeid 2
Oct 29 13:02:15 [1016] sc-node-2        cib:   notice: get_node_name:   
Defaulting to uname -n for the local corosync node name
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_sync operation for section 'all': OK (rc=0, origin=local/crmd/14, 
version=0.11.0)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: crm_update_peer_join:    
finalize_join_for: Node sc-node-2[2] - join-1 phase 2 -> 3
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section nodes: OK (rc=0, 
origin=local/crmd/15, version=0.11.0)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: erase_status_tag:        
Deleting xpath: //node_state[@uname='sc-node-2']/transient_attributes
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: update_attrd:    
Connecting to attrd... 5 retries remaining
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_delete operation for section 
//node_state[@uname='sc-node-2']/transient_attributes: OK (rc=0, 
origin=local/crmd/16, version=0.11.0)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: crm_update_peer_join:    
do_dc_join_ack: Node sc-node-2[2] - join-1 phase 3 -> 4
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_dc_join_ack:  join-1: 
Updating node state to member for sc-node-2
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: erase_status_tag:        
Deleting xpath: //node_state[@uname='sc-node-2']/lrm
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_delete operation for section 
//node_state[@uname='sc-node-2']/lrm: OK (rc=0, origin=local/crmd/17, 
version=0.11.0)
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/18, version=0.11.1)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_state_transition:     
State transition S_FINALIZE_JOIN -> S_POLICY_ENGINE [ input=I_FINALIZED 
cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: abort_transition_graph:  
do_te_invoke:151 - Triggered transition abort (complete=1) : Peer Cancelled
Oct 29 13:02:15 [1019] sc-node-2      attrd:   notice: attrd_local_callback:    
Sending full refresh (origin=crmd)
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section nodes: OK (rc=0, 
origin=local/crmd/19, version=0.11.1)
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/20, version=0.11.1)
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section cib: OK (rc=0, origin=local/crmd/21, 
version=0.11.2)
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section 'all': OK (rc=0, 
origin=local/crmd/22, version=0.11.2)
Oct 29 13:02:15 [1020] sc-node-2    pengine:     info: determine_online_status: 
        Node sc-node-2 is online
Oct 29 13:02:15 [1020] sc-node-2    pengine:     info: group_print:      
Resource Group: sc-resources
Oct 29 13:02:15 [1020] sc-node-2    pengine:     info: native_print:         
sc_vip     (ocf::heartbeat:IPaddr2):       Stopped 
Oct 29 13:02:15 [1020] sc-node-2    pengine:     info: native_print:         
oc-service-manager (upstart:oc-service-manager):   Stopped 
Oct 29 13:02:15 [1020] sc-node-2    pengine:     info: native_print:         
oc-fw-agent        (upstart:oc-fw-agent):  Stopped 
Oct 29 13:02:15 [1020] sc-node-2    pengine:     info: native_print:         
oc-lb-agent        (upstart:oc-lb-agent):  Stopped 
Oct 29 13:02:15 [1020] sc-node-2    pengine:     info: native_print:         
oc-vpn-agent       (upstart:oc-vpn-agent): Stopped 
Oct 29 13:02:15 [1020] sc-node-2    pengine:     info: RecurringOp:      Start 
recurring monitor (15s) for sc_vip on sc-node-2
Oct 29 13:02:15 [1020] sc-node-2    pengine:     info: RecurringOp:      Start 
recurring monitor (15s) for oc-service-manager on sc-node-2
Oct 29 13:02:15 [1020] sc-node-2    pengine:     info: RecurringOp:      Start 
recurring monitor (15s) for oc-fw-agent on sc-node-2
Oct 29 13:02:15 [1020] sc-node-2    pengine:     info: RecurringOp:      Start 
recurring monitor (15s) for oc-lb-agent on sc-node-2
Oct 29 13:02:15 [1020] sc-node-2    pengine:     info: RecurringOp:      Start 
recurring monitor (15s) for oc-vpn-agent on sc-node-2
Oct 29 13:02:15 [1020] sc-node-2    pengine:   notice: LogActions:      Start   
sc_vip  (sc-node-2)
Oct 29 13:02:15 [1020] sc-node-2    pengine:   notice: LogActions:      Start   
oc-service-manager      (sc-node-2)
Oct 29 13:02:15 [1020] sc-node-2    pengine:   notice: LogActions:      Start   
oc-fw-agent     (sc-node-2)
Oct 29 13:02:15 [1020] sc-node-2    pengine:   notice: LogActions:      Start   
oc-lb-agent     (sc-node-2)
Oct 29 13:02:15 [1020] sc-node-2    pengine:   notice: LogActions:      Start   
oc-vpn-agent    (sc-node-2)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_state_transition:     
State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS 
cause=C_IPC_MESSAGE origin=handle_response ]
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_te_invoke:    
Processing graph 0 (ref=pe_calc-dc-1446138135-7) derived from 
/var/lib/pacemaker/pengine/pe-input-340.bz2
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 4: monitor sc_vip_monitor_0 on sc-node-2 (local)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: 
process_lrmd_get_rsc_info:       Resource 'sc_vip' not found (0 active 
resources)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: 
process_lrmd_rsc_register:       Added 'sc_vip' to the rsc list (1 active 
resources)
Oct 29 13:02:15 [1020] sc-node-2    pengine:   notice: process_pe_message:      
Calculated Transition 0: /var/lib/pacemaker/pengine/pe-input-340.bz2
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=4:0:7:419f8af1-d7a7-4913-b197-2d89ff519e66 op=sc_vip_monitor_0
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 5: monitor oc-service-manager_monitor_0 on sc-node-2 (local)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: 
process_lrmd_get_rsc_info:       Resource 'oc-service-manager' not found (1 
active resources)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: 
process_lrmd_rsc_register:       Added 'oc-service-manager' to the rsc list (2 
active resources)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=5:0:7:419f8af1-d7a7-4913-b197-2d89ff519e66 
op=oc-service-manager_monitor_0
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 6: monitor oc-fw-agent_monitor_0 on sc-node-2 (local)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: get_first_instance:      
Result: (null)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_running:     
oc-service-manager is not running
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: 
process_lrmd_get_rsc_info:       Resource 'oc-fw-agent' not found (2 active 
resources)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: 
process_lrmd_rsc_register:       Added 'oc-fw-agent' to the rsc list (3 active 
resources)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=6:0:7:419f8af1-d7a7-4913-b197-2d89ff519e66 
op=oc-fw-agent_monitor_0
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 7: monitor oc-lb-agent_monitor_0 on sc-node-2 (local)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: get_first_instance:      
Result: (null)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_running:     
oc-fw-agent is not running
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: 
process_lrmd_get_rsc_info:       Resource 'oc-lb-agent' not found (3 active 
resources)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: 
process_lrmd_rsc_register:       Added 'oc-lb-agent' to the rsc list (4 active 
resources)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=7:0:7:419f8af1-d7a7-4913-b197-2d89ff519e66 
op=oc-lb-agent_monitor_0
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 8: monitor oc-vpn-agent_monitor_0 on sc-node-2 (local)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: get_first_instance:      
Result: (null)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_running:     
oc-lb-agent is not running
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: 
process_lrmd_get_rsc_info:       Resource 'oc-vpn-agent' not found (4 active 
resources)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: 
process_lrmd_rsc_register:       Added 'oc-vpn-agent' to the rsc list (5 active 
resources)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=8:0:7:419f8af1-d7a7-4913-b197-2d89ff519e66 
op=oc-vpn-agent_monitor_0
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: get_first_instance:      
Result: (null)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_running:     
oc-vpn-agent is not running
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/23, version=0.11.3)
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-service-manager_monitor_0 (call=9, rc=7, cib-update=23, 
confirmed=true) not running
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-fw-agent_monitor_0 (call=13, rc=7, cib-update=24, 
confirmed=true) not running
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/24, version=0.11.4)
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-lb-agent_monitor_0 (call=17, rc=7, cib-update=25, 
confirmed=true) not running
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/25, version=0.11.5)
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-vpn-agent_monitor_0 (call=21, rc=7, cib-update=26, 
confirmed=true) not running
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action oc-service-manager_monitor_0 (5) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action oc-fw-agent_monitor_0 (6) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action oc-lb-agent_monitor_0 (7) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/26, version=0.11.6)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action oc-vpn-agent_monitor_0 (8) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: 
services_os_action_execute:      Managed IPaddr2_meta-data_0 process 1336 
exited with rc=0
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation sc_vip_monitor_0 (call=5, rc=7, cib-update=27, confirmed=true) 
not running
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/27, version=0.11.7)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action sc_vip_monitor_0 (4) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 3: probe_complete probe_complete on sc-node-2 (local) - no 
waiting
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: te_rsc_command:  Action 
3 confirmed - no wait
Oct 29 13:02:15 [1019] sc-node-2      attrd:   notice: attrd_trigger_update:    
Sending flush op to all hosts for: probe_complete (true)
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 9: start sc_vip_start_0 on sc-node-2 (local)
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section 
//cib/status//node_state[@id='2']//transient_attributes//nvpair[@name='probe_complete']:
 No such device or address (rc=-6, origin=local/attrd/2, version=0.11.7)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=9:0:0:419f8af1-d7a7-4913-b197-2d89ff519e66 op=sc_vip_start_0
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: log_execute:     
executing - rsc:sc_vip action:start call_id:28
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section /cib: OK (rc=0, origin=local/attrd/3, 
version=0.11.7)
Oct 29 13:02:15 [1019] sc-node-2      attrd:   notice: attrd_perform_update:    
Sent update 4: probe_complete=true
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/attrd/4, version=0.11.8)
Oct 29 13:02:15 [1019] sc-node-2      attrd:   notice: corosync_node_name:      
Unable to get node name for nodeid 2
Oct 29 13:02:15 [1019] sc-node-2      attrd:   notice: get_node_name:   
Defaulting to uname -n for the local corosync node name
IPaddr2(sc_vip)[1340]:  2015/10/29_13:02:15 INFO: Adding IPv4 address 
200.10.10.188/24 with broadcast address 200.10.10.255 to device eth1
IPaddr2(sc_vip)[1340]:  2015/10/29_13:02:15 INFO: Bringing device eth1 up
IPaddr2(sc_vip)[1340]:  2015/10/29_13:02:15 INFO: /usr/lib/heartbeat/send_arp 
-i 200 -r 5 -p /var/run/resource-agents/send_arp-200.10.10.188 eth1 
200.10.10.188 auto not_used not_used
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: log_finished:    
finished - rsc:sc_vip action:start call_id:28 pid:1340 exit-code:0 
exec-time:38ms queue-time:0ms
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation sc_vip_start_0 (call=28, rc=0, cib-update=28, confirmed=true) ok
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/28, version=0.11.9)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action sc_vip_start_0 (9) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 10: monitor sc_vip_monitor_15000 on sc-node-2 (local)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=10:0:0:419f8af1-d7a7-4913-b197-2d89ff519e66 
op=sc_vip_monitor_15000
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 11: start oc-service-manager_start_0 on sc-node-2 (local)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=11:0:0:419f8af1-d7a7-4913-b197-2d89ff519e66 
op=oc-service-manager_start_0
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: log_execute:     
executing - rsc:oc-service-manager action:start call_id:33
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_exec_done:   
Call to start passed: type '(o)' 
/com/ubuntu/Upstart/jobs/oc_2dservice_2dmanager/_
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: log_finished:    
finished - rsc:oc-service-manager action:start call_id:33  exit-code:0 
exec-time:19ms queue-time:0ms
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-service-manager_start_0 (call=33, rc=0, cib-update=29, 
confirmed=true) ok
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/29, version=0.11.10)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action oc-service-manager_start_0 (11) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 12: monitor oc-service-manager_monitor_15000 on sc-node-2 
(local)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=12:0:0:419f8af1-d7a7-4913-b197-2d89ff519e66 
op=oc-service-manager_monitor_15000
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 13: start oc-fw-agent_start_0 on sc-node-2 (local)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: get_first_instance:      
Result: /com/ubuntu/Upstart/jobs/oc_2dservice_2dmanager/_
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_property:    
Calling GetAll on /com/ubuntu/Upstart/jobs/oc_2dservice_2dmanager/_
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_property:    
Call to GetAll passed: type '(a{sv})' 1
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_property:    
Got value 'running' for /com/ubuntu/Upstart/jobs/oc_2dservice_2dmanager/_[state]
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_running:     
State of oc-service-manager: running
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_running:     
oc-service-manager is running
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=13:0:0:419f8af1-d7a7-4913-b197-2d89ff519e66 
op=oc-fw-agent_start_0
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: log_execute:     
executing - rsc:oc-fw-agent action:start call_id:38
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-service-manager_monitor_15000 (call=36, rc=0, cib-update=30, 
confirmed=false) ok
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/30, version=0.11.11)
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation sc_vip_monitor_15000 (call=31, rc=0, cib-update=31, 
confirmed=false) ok
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action oc-service-manager_monitor_15000 (12) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/31, version=0.11.12)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action sc_vip_monitor_15000 (10) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_exec_done:   
Call to start passed: type '(o)' /com/ubuntu/Upstart/jobs/oc_2dfw_2dagent/_
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: log_finished:    
finished - rsc:oc-fw-agent action:start call_id:38  exit-code:0 exec-time:29ms 
queue-time:0ms
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-fw-agent_start_0 (call=38, rc=0, cib-update=32, 
confirmed=true) ok
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/32, version=0.11.13)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action oc-fw-agent_start_0 (13) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 14: monitor oc-fw-agent_monitor_15000 on sc-node-2 (local)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=14:0:0:419f8af1-d7a7-4913-b197-2d89ff519e66 
op=oc-fw-agent_monitor_15000
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 15: start oc-lb-agent_start_0 on sc-node-2 (local)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: get_first_instance:      
Result: /com/ubuntu/Upstart/jobs/oc_2dfw_2dagent/_
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_property:    
Calling GetAll on /com/ubuntu/Upstart/jobs/oc_2dfw_2dagent/_
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_property:    
Call to GetAll passed: type '(a{sv})' 1
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_property:    
Got value 'running' for /com/ubuntu/Upstart/jobs/oc_2dfw_2dagent/_[state]
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_running:     
State of oc-fw-agent: running
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_running:     
oc-fw-agent is running
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=15:0:0:419f8af1-d7a7-4913-b197-2d89ff519e66 
op=oc-lb-agent_start_0
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: log_execute:     
executing - rsc:oc-lb-agent action:start call_id:45
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-fw-agent_monitor_15000 (call=43, rc=0, cib-update=33, 
confirmed=false) ok
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/33, version=0.11.14)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action oc-fw-agent_monitor_15000 (14) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_exec_done:   
Call to start passed: type '(o)' /com/ubuntu/Upstart/jobs/oc_2dlb_2dagent/_
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: log_finished:    
finished - rsc:oc-lb-agent action:start call_id:45  exit-code:0 exec-time:23ms 
queue-time:0ms
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-lb-agent_start_0 (call=45, rc=0, cib-update=34, 
confirmed=true) ok
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/34, version=0.11.15)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action oc-lb-agent_start_0 (15) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 16: monitor oc-lb-agent_monitor_15000 on sc-node-2 (local)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=16:0:0:419f8af1-d7a7-4913-b197-2d89ff519e66 
op=oc-lb-agent_monitor_15000
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 17: start oc-vpn-agent_start_0 on sc-node-2 (local)
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: get_first_instance:      
Result: /com/ubuntu/Upstart/jobs/oc_2dlb_2dagent/_
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_property:    
Calling GetAll on /com/ubuntu/Upstart/jobs/oc_2dlb_2dagent/_
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_property:    
Call to GetAll passed: type '(a{sv})' 1
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_property:    
Got value 'running' for /com/ubuntu/Upstart/jobs/oc_2dlb_2dagent/_[state]
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_running:     
State of oc-lb-agent: running
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_running:     
oc-lb-agent is running
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=17:0:0:419f8af1-d7a7-4913-b197-2d89ff519e66 
op=oc-vpn-agent_start_0
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: log_execute:     
executing - rsc:oc-vpn-agent action:start call_id:51
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-lb-agent_monitor_15000 (call=49, rc=0, cib-update=35, 
confirmed=false) ok
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: crm_client_new:  
Connecting 0x7f02b54c2890 for uid=0 gid=0 pid=1439 
id=96b0787a-1720-403f-bdb6-647988cbe4b7
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_exec_done:   
Call to start passed: type '(o)' /com/ubuntu/Upstart/jobs/oc_2dvpn_2dagent/_
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: log_finished:    
finished - rsc:oc-vpn-agent action:start call_id:51  exit-code:0 exec-time:15ms 
queue-time:0ms
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-vpn-agent_start_0 (call=51, rc=0, cib-update=36, 
confirmed=true) ok
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action oc-lb-agent_monitor_15000 (16) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/35, version=0.11.16)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action oc-vpn-agent_start_0 (17) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 18: monitor oc-vpn-agent_monitor_15000 on sc-node-2 (local)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=18:0:0:419f8af1-d7a7-4913-b197-2d89ff519e66 
op=oc-vpn-agent_monitor_15000
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: get_first_instance:      
Result: /com/ubuntu/Upstart/jobs/oc_2dvpn_2dagent/_
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_property:    
Calling GetAll on /com/ubuntu/Upstart/jobs/oc_2dvpn_2dagent/_
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_property:    
Call to GetAll passed: type '(a{sv})' 1
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_property:    
Got value 'running' for /com/ubuntu/Upstart/jobs/oc_2dvpn_2dagent/_[state]
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_running:     
State of oc-vpn-agent: running
Oct 29 13:02:15 [1018] sc-node-2       lrmd:     info: upstart_job_running:     
oc-vpn-agent is running
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-vpn-agent_monitor_15000 (call=55, rc=0, cib-update=37, 
confirmed=false) ok
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/36, version=0.11.17)
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: match_graph_event:       
Action oc-vpn-agent_monitor_15000 (18) confirmed on sc-node-2 (rc=0)
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: run_graph:       
Transition 0 (Complete=19, Pending=0, Fired=0, Skipped=0, Incomplete=0, 
Source=/var/lib/pacemaker/pengine/pe-input-340.bz2): Complete
Oct 29 13:02:15 [1021] sc-node-2       crmd:     info: do_log:  FSA: Input 
I_TE_SUCCESS from notify_crmd() received in state S_TRANSITION_ENGINE
Oct 29 13:02:15 [1021] sc-node-2       crmd:   notice: do_state_transition:     
State transition S_TRANSITION_ENGINE -> S_IDLE [ input=I_TE_SUCCESS 
cause=C_FSA_INTERNAL origin=notify_crmd ]
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/37, version=0.11.18)
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section 'all': OK (rc=0, 
origin=local/crm_mon/2, version=0.11.18)
Oct 29 13:02:15 [1016] sc-node-2        cib:     info: crm_client_destroy:      
Destroying 0 events
Oct 29 13:02:17 [1016] sc-node-2        cib:     info: crm_client_new:  
Connecting 0x7f02b54c2890 for uid=0 gid=0 pid=1463 
id=b07521fd-f2bb-4267-8c2d-3ced76636afe
Oct 29 13:02:17 [1016] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section 'all': OK (rc=0, 
origin=local/crm_mon/2, version=0.11.18)
Oct 29 13:02:17 [1016] sc-node-2        cib:     info: crm_client_destroy:      
Destroying 0 events

----- Failure case log (13:10, resources never started) -----

Oct 29 13:10:03 [1023] sc-node-2        cib:     info: crm_client_destroy:      
Destroying 0 events
Oct 29 13:10:05 [1023] sc-node-2        cib:     info: crm_client_new:  
Connecting 0x7f890dc4ad00 for uid=0 gid=0 pid=1291 
id=43b404f1-46da-48a4-b0d2-94ada6bf98eb
Oct 29 13:10:05 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section 'all': OK (rc=0, 
origin=local/crm_mon/2, version=0.11.0)
Oct 29 13:10:05 [1023] sc-node-2        cib:     info: crm_client_destroy:      
Destroying 0 events
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: crm_timer_popped:        
Election Trigger (I_DC_TIMEOUT) just popped (20000ms)
Oct 29 13:10:06 [1028] sc-node-2       crmd:  warning: do_log:  FSA: Input 
I_DC_TIMEOUT from crm_timer_popped() received in state S_PENDING
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_state_transition:     
State transition S_PENDING -> S_ELECTION [ input=I_DC_TIMEOUT 
cause=C_TIMER_POPPED origin=crm_timer_popped ]
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_log:  FSA: Input 
I_ELECTION_DC from do_election_check() received in state S_ELECTION
Oct 29 13:10:06 [1028] sc-node-2       crmd:   notice: do_state_transition:     
State transition S_ELECTION -> S_INTEGRATION [ input=I_ELECTION_DC 
cause=C_FSA_INTERNAL origin=do_election_check ]
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_te_control:   
Registering TE UUID: e130e80d-75e0-416f-ac5a-d318fcdd6ec2
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: set_graph_functions:     
Setting custom graph functions
Oct 29 13:10:06 [1027] sc-node-2    pengine:     info: crm_client_new:  
Connecting 0x7f5a24271a60 for uid=106 gid=113 pid=1028 
id=03ff4b05-5b82-41b8-a5fc-71025419cd50
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_dc_takeover:  Taking 
over DC status for this partition
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_readwrite:   
We are now in R/W mode
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_master operation for section 'all': OK (rc=0, 
origin=local/crmd/6, version=0.11.0)
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section cib: OK (rc=0, origin=local/crmd/7, 
version=0.11.0)
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section 
//cib/configuration/crm_config//cluster_property_set//nvpair[@name='dc-version']:
 OK (rc=0, origin=local/crmd/8, version=0.11.0)
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section crm_config: OK (rc=0, 
origin=local/crmd/9, version=0.11.0)
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section 
//cib/configuration/crm_config//cluster_property_set//nvpair[@name='cluster-infrastructure']:
 OK (rc=0, origin=local/crmd/10, version=0.11.0)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: join_make_offer:         
Making join offers based on membership 2088
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: join_make_offer:         
join-1: Sending offer to sc-node-2
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: crm_update_peer_join:    
join_make_offer: Node sc-node-2[2] - join-1 phase 0 -> 1
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_dc_join_offer_all:    
join-1: Waiting on 1 outstanding join acks
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section crm_config: OK (rc=0, 
origin=local/crmd/11, version=0.11.0)
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section crm_config: OK (rc=0, 
origin=local/crmd/12, version=0.11.0)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: update_dc:       Set DC 
to sc-node-2 (3.0.7)
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section 'all': OK (rc=0, 
origin=local/crmd/13, version=0.11.0)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: crm_update_peer_join:    
do_dc_join_filter_offer: Node sc-node-2[2] - join-1 phase 1 -> 2
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: 
crm_update_peer_expected:        do_dc_join_filter_offer: Node sc-node-2[2] - 
expected state is now member
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_state_transition:     
State transition S_INTEGRATION -> S_FINALIZE_JOIN [ input=I_INTEGRATED 
cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: crmd_join_phase_log:     
join-1: sc-node-2=integrated
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_dc_join_finalize:     
join-1: Syncing our CIB to the rest of the cluster
Oct 29 13:10:06 [1023] sc-node-2        cib:   notice: corosync_node_name:      
Unable to get node name for nodeid 2
Oct 29 13:10:06 [1023] sc-node-2        cib:   notice: get_node_name:   
Defaulting to uname -n for the local corosync node name
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_sync operation for section 'all': OK (rc=0, origin=local/crmd/14, 
version=0.11.0)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: crm_update_peer_join:    
finalize_join_for: Node sc-node-2[2] - join-1 phase 2 -> 3
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section nodes: OK (rc=0, 
origin=local/crmd/15, version=0.11.0)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: erase_status_tag:        
Deleting xpath: //node_state[@uname='sc-node-2']/transient_attributes
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: update_attrd:    
Connecting to attrd... 5 retries remaining
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_delete operation for section 
//node_state[@uname='sc-node-2']/transient_attributes: OK (rc=0, 
origin=local/crmd/16, version=0.11.0)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: crm_update_peer_join:    
do_dc_join_ack: Node sc-node-2[2] - join-1 phase 3 -> 4
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_dc_join_ack:  join-1: 
Updating node state to member for sc-node-2
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: erase_status_tag:        
Deleting xpath: //node_state[@uname='sc-node-2']/lrm
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_delete operation for section 
//node_state[@uname='sc-node-2']/lrm: OK (rc=0, origin=local/crmd/17, 
version=0.11.0)
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/18, version=0.11.1)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_state_transition:     
State transition S_FINALIZE_JOIN -> S_POLICY_ENGINE [ input=I_FINALIZED 
cause=C_FSA_INTERNAL origin=check_join_state ]
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: abort_transition_graph:  
do_te_invoke:151 - Triggered transition abort (complete=1) : Peer Cancelled
Oct 29 13:10:06 [1026] sc-node-2      attrd:   notice: attrd_local_callback:    
Sending full refresh (origin=crmd)
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section nodes: OK (rc=0, 
origin=local/crmd/19, version=0.11.1)
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/20, version=0.11.1)
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section cib: OK (rc=0, origin=local/crmd/21, 
version=0.11.2)
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section 'all': OK (rc=0, 
origin=local/crmd/22, version=0.11.2)
Oct 29 13:10:06 [1027] sc-node-2    pengine:     info: determine_online_status: 
        Node sc-node-2 is online
Oct 29 13:10:06 [1027] sc-node-2    pengine:     info: group_print:      
Resource Group: sc-resources
Oct 29 13:10:06 [1027] sc-node-2    pengine:     info: native_print:         
sc_vip     (ocf::heartbeat:IPaddr2):       Stopped 
Oct 29 13:10:06 [1027] sc-node-2    pengine:     info: native_print:         
oc-service-manager (upstart:oc-service-manager):   Stopped 
Oct 29 13:10:06 [1027] sc-node-2    pengine:     info: native_print:         
oc-fw-agent        (upstart:oc-fw-agent):  Stopped 
Oct 29 13:10:06 [1027] sc-node-2    pengine:     info: native_print:         
oc-lb-agent        (upstart:oc-lb-agent):  Stopped 
Oct 29 13:10:06 [1027] sc-node-2    pengine:     info: native_print:         
oc-vpn-agent       (upstart:oc-vpn-agent): Stopped 
Oct 29 13:10:06 [1027] sc-node-2    pengine:     info: RecurringOp:      Start 
recurring monitor (15s) for sc_vip on sc-node-2
Oct 29 13:10:06 [1027] sc-node-2    pengine:     info: RecurringOp:      Start 
recurring monitor (15s) for oc-service-manager on sc-node-2
Oct 29 13:10:06 [1027] sc-node-2    pengine:     info: RecurringOp:      Start 
recurring monitor (15s) for oc-fw-agent on sc-node-2
Oct 29 13:10:06 [1027] sc-node-2    pengine:     info: RecurringOp:      Start 
recurring monitor (15s) for oc-lb-agent on sc-node-2
Oct 29 13:10:06 [1027] sc-node-2    pengine:     info: RecurringOp:      Start 
recurring monitor (15s) for oc-vpn-agent on sc-node-2
Oct 29 13:10:06 [1027] sc-node-2    pengine:   notice: LogActions:      Start   
sc_vip  (sc-node-2)
Oct 29 13:10:06 [1027] sc-node-2    pengine:   notice: LogActions:      Start   
oc-service-manager      (sc-node-2)
Oct 29 13:10:06 [1027] sc-node-2    pengine:   notice: LogActions:      Start   
oc-fw-agent     (sc-node-2)
Oct 29 13:10:06 [1027] sc-node-2    pengine:   notice: LogActions:      Start   
oc-lb-agent     (sc-node-2)
Oct 29 13:10:06 [1027] sc-node-2    pengine:   notice: LogActions:      Start   
oc-vpn-agent    (sc-node-2)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_state_transition:     
State transition S_POLICY_ENGINE -> S_TRANSITION_ENGINE [ input=I_PE_SUCCESS 
cause=C_IPC_MESSAGE origin=handle_response ]
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_te_invoke:    
Processing graph 0 (ref=pe_calc-dc-1446138606-7) derived from 
/var/lib/pacemaker/pengine/pe-input-347.bz2
Oct 29 13:10:06 [1027] sc-node-2    pengine:   notice: process_pe_message:      
Calculated Transition 0: /var/lib/pacemaker/pengine/pe-input-347.bz2
Oct 29 13:10:06 [1028] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 4: monitor sc_vip_monitor_0 on sc-node-2 (local)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: 
process_lrmd_get_rsc_info:       Resource 'sc_vip' not found (0 active 
resources)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: 
process_lrmd_rsc_register:       Added 'sc_vip' to the rsc list (1 active 
resources)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=4:0:7:e130e80d-75e0-416f-ac5a-d318fcdd6ec2 op=sc_vip_monitor_0
Oct 29 13:10:06 [1028] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 5: monitor oc-service-manager_monitor_0 on sc-node-2 (local)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: 
process_lrmd_get_rsc_info:       Resource 'oc-service-manager' not found (1 
active resources)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: 
process_lrmd_rsc_register:       Added 'oc-service-manager' to the rsc list (2 
active resources)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=5:0:7:e130e80d-75e0-416f-ac5a-d318fcdd6ec2 
op=oc-service-manager_monitor_0
Oct 29 13:10:06 [1028] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 6: monitor oc-fw-agent_monitor_0 on sc-node-2 (local)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: get_first_instance:      
Result: (null)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: upstart_job_running:     
oc-service-manager is not running
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: 
process_lrmd_get_rsc_info:       Resource 'oc-fw-agent' not found (2 active 
resources)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: 
process_lrmd_rsc_register:       Added 'oc-fw-agent' to the rsc list (3 active 
resources)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=6:0:7:e130e80d-75e0-416f-ac5a-d318fcdd6ec2 
op=oc-fw-agent_monitor_0
Oct 29 13:10:06 [1028] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 7: monitor oc-lb-agent_monitor_0 on sc-node-2 (local)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: get_first_instance:      
Result: (null)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: upstart_job_running:     
oc-fw-agent is not running
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: 
process_lrmd_get_rsc_info:       Resource 'oc-lb-agent' not found (3 active 
resources)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: 
process_lrmd_rsc_register:       Added 'oc-lb-agent' to the rsc list (4 active 
resources)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=7:0:7:e130e80d-75e0-416f-ac5a-d318fcdd6ec2 
op=oc-lb-agent_monitor_0
Oct 29 13:10:06 [1028] sc-node-2       crmd:   notice: te_rsc_command:  
Initiating action 8: monitor oc-vpn-agent_monitor_0 on sc-node-2 (local)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: get_first_instance:      
Result: (null)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: upstart_job_running:     
oc-lb-agent is not running
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: 
process_lrmd_get_rsc_info:       Resource 'oc-vpn-agent' not found (4 active 
resources)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: 
process_lrmd_rsc_register:       Added 'oc-vpn-agent' to the rsc list (5 active 
resources)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: do_lrm_rsc_op:   
Performing key=8:0:7:e130e80d-75e0-416f-ac5a-d318fcdd6ec2 
op=oc-vpn-agent_monitor_0
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: get_first_instance:      
Result: (null)
Oct 29 13:10:06 [1025] sc-node-2       lrmd:     info: upstart_job_running:     
oc-vpn-agent is not running
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/23, version=0.11.3)
Oct 29 13:10:06 [1028] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-service-manager_monitor_0 (call=9, rc=7, cib-update=23, 
confirmed=true) not running
Oct 29 13:10:06 [1028] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-fw-agent_monitor_0 (call=13, rc=7, cib-update=24, 
confirmed=true) not running
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/24, version=0.11.4)
Oct 29 13:10:06 [1028] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-lb-agent_monitor_0 (call=17, rc=7, cib-update=25, 
confirmed=true) not running
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/25, version=0.11.5)
Oct 29 13:10:06 [1028] sc-node-2       crmd:   notice: process_lrm_event:       
LRM operation oc-vpn-agent_monitor_0 (call=21, rc=7, cib-update=26, 
confirmed=true) not running
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: match_graph_event:       
Action oc-service-manager_monitor_0 (5) confirmed on sc-node-2 (rc=0)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: match_graph_event:       
Action oc-fw-agent_monitor_0 (6) confirmed on sc-node-2 (rc=0)
Oct 29 13:10:06 [1028] sc-node-2       crmd:     info: match_graph_event:       
Action oc-lb-agent_monitor_0 (7) confirmed on sc-node-2 (rc=0)
Oct 29 13:10:06 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_modify operation for section status: OK (rc=0, 
origin=local/crmd/26, version=0.11.6)
Oct 29 13:10:07 [1023] sc-node-2        cib:     info: crm_client_new:  
Connecting 0x7f890dc5c0b0 for uid=0 gid=0 pid=1360 
id=bb7daa59-f0d5-44ea-99a1-360169ea9b78
Oct 29 13:10:07 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section 'all': OK (rc=0, 
origin=local/crm_mon/2, version=0.11.6)
Oct 29 13:10:07 [1023] sc-node-2        cib:     info: crm_client_destroy:      
Destroying 0 events
Oct 29 13:10:09 [1023] sc-node-2        cib:     info: crm_client_new:  
Connecting 0x7f890dc5c0b0 for uid=0 gid=0 pid=1363 
id=2da2d19d-b56d-405c-a936-c287aec2cb24
Oct 29 13:10:09 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section 'all': OK (rc=0, 
origin=local/crm_mon/2, version=0.11.6)
Oct 29 13:10:09 [1023] sc-node-2        cib:     info: crm_client_destroy:      
Destroying 0 events
Oct 29 13:10:11 [1023] sc-node-2        cib:     info: crm_client_new:  
Connecting 0x7f890dc5c0b0 for uid=0 gid=0 pid=1366 
id=8e5d0619-1d42-47a2-93db-8d2f63476f85
Oct 29 13:10:11 [1023] sc-node-2        cib:     info: cib_process_request:     
Completed cib_query operation for section 'all': OK (rc=0, 
origin=local/crm_mon/2, version=0.11.6)