Hi!
I hope someone can help me. We are running a web-based application that makes queries to an Oracle server in a remote network over a WAN, through a VPN tunnel established by the routers. The problem is that the web-based application sends packets with the "don't fragment" (DF) bit set, so when the routers encrypt the packets they cannot fragment the large ones and drop them instead. I don't know where this DF bit comes from or what sets it; the web-based application was developed in-house.
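For what it's worth, on many platforms the DF bit is set by the operating system's Path MTU Discovery rather than by the application itself. A minimal sketch (assuming a Linux host; the `IP_MTU_DISCOVER` option and `IP_PMTUDISC_*` values are taken from the Linux ip(7) man page) to inspect the per-socket PMTU-discovery mode:

```python
import socket

# Linux-specific socket option; fall back to its numeric value (10)
# if this Python build does not expose the constant.
IP_MTU_DISCOVER = getattr(socket, "IP_MTU_DISCOVER", 10)

# Linux mode values (from ip(7)):
IP_PMTUDISC_DONT = 0  # never set DF
IP_PMTUDISC_WANT = 1  # use per-route hints
IP_PMTUDISC_DO = 2    # always set DF (Path MTU Discovery on)

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
mode = s.getsockopt(socket.IPPROTO_IP, IP_MTU_DISCOVER)
print("PMTU-discovery mode:", mode)
s.close()
```

If the mode comes back as `IP_PMTUDISC_WANT` or `IP_PMTUDISC_DO`, the kernel is the one setting DF, not the in-house code; switching the socket to `IP_PMTUDISC_DONT` with `setsockopt` would clear it for that socket.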
If someone can give me a clue, I will really appreciate it!
Thanks for your help.
Miguel Martinez
