On 06/15/2016 05:15 PM, Balram Pariyarath wrote:
> A few days back my problem was that I cannot make any calls using
> telepathy (test-calls or msgs). Now I cant connect to my SIP acct.
> using empathy!
I'm sure there is an explanation for everything. Check the debug logs of your connection manager and of mission-control. Empathy has a UI for showing logs from mission-control and from CMs that implement the Debug interface, which you can use to inspect them; kde-telepathy has one as well.

> @George: can you please elaborate about call1.content and call1.stream?
> Should they be implemented externally? and how?

As you probably already know, calls in telepathy are represented as channel objects that implement Channel.Type.Call1. Ch.T.Call1 cannot work on its own, though: your CM should also expose at least one Call1.Content object, which in turn should contain at least one Call1.Stream object.

Semantically, a Content represents one media content in a call (ex. audio or video). A Stream represents an actual media stream and is contained within a Content. Usually there is one Stream per Content, since you are streaming to one peer (either p2p or to a server), but there could be more streams (ex. if you were streaming to multiple peers directly, without a relay server, using unicast; multicast would still be one stream, but with multiple endpoints).

Now, typically, protocols such as SIP and XMPP do not do the actual media streaming on their own. They are used only for signaling the properties of the media stream, and the media itself travels over some other protocol. In order to allow better integration with the UIs, the design of telepathy places the signaling protocol in the CM and the actual media streaming in the UI. So, when you are doing a call with empathy and tp-gabble, gabble does the signaling part and exposes the relevant information, and empathy does the actual media streaming. In order to expose this information, the CM also needs to implement the media interfaces (Call1.Content.Interface.Media & Call1.Stream.Interface.Media) on the Content & Stream objects, as well as any other relevant interfaces from this category.
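To visualise the object hierarchy I'm describing, here is a minimal sketch in plain Python (no real D-Bus involved). The D-Bus interface names are the ones from telepathy-spec; the Python class and attribute names are made up for illustration only, this is not code from any actual CM:

```python
# Illustrative model of the Call1 object hierarchy a CM exposes.
# Interface name strings follow telepathy-spec; everything else
# (class names, attributes) is invented for this sketch.

CHAN_TYPE_CALL1 = "org.freedesktop.Telepathy.Channel.Type.Call1"
CONTENT_IFACE_MEDIA = "org.freedesktop.Telepathy.Call1.Content.Interface.Media"
STREAM_IFACE_MEDIA = "org.freedesktop.Telepathy.Call1.Stream.Interface.Media"

class CallStream:
    """One media stream towards one (or more) remote endpoints."""
    def __init__(self, interfaces=(STREAM_IFACE_MEDIA,)):
        self.interfaces = list(interfaces)

class CallContent:
    """One media content (audio or video); contains at least one stream."""
    def __init__(self, media_type, interfaces=(CONTENT_IFACE_MEDIA,)):
        self.media_type = media_type      # "audio" or "video"
        self.interfaces = list(interfaces)
        self.streams = [CallStream()]     # usually exactly one stream

class CallChannel:
    """The Channel.Type.Call1 channel; contains at least one content."""
    channel_type = CHAN_TYPE_CALL1
    def __init__(self):
        self.contents = []
    def add_content(self, media_type):
        content = CallContent(media_type)
        self.contents.append(content)
        return content

# A p2p audio+video call: two contents, one stream each, with the
# Media interfaces present so a UI can extract the signaling info.
call = CallChannel()
call.add_content("audio")
call.add_content("video")
```

So when a UI inspects the channel, it walks exactly this tree: channel, then its contents, then each content's streams, looking for the Media interfaces on the last two.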
The UI then uses these interfaces to extract the signaling information from the CM and set up the media stream.

In certain cases, however, it is not possible to separate the media streaming from the signaling. There are two cases here as well.

The first case is when you have dedicated hardware that does the call (ex. a GSM modem on a phone). In this case the software (both the CM and the UI) has no control over the media streaming, so the CM exposes the Ch.T.Call1.HardwareStreaming=True property, and this way the UI knows that it shouldn't bother with the stream. This is not very user friendly, though, because the media controls in the UI won't be available; the user will have to deal with the device directly somehow.

The second case is when you have a CM for a proprietary protocol whose media streaming cannot be implemented in the UI for legal reasons. In this case the UI should still handle the audio/video devices and transfer the data to/from the CM, which will handle the actual encoding/decoding and streaming. This can be done by exposing the Media interfaces normally and setting the transport type [1] to SHM, which tells the media streaming component to offload media data to/from a shared memory segment that the CM can use on its side to exchange this data with the UI.

The reason I asked this question is that I wonder how you are implementing calls in your CM. If HardwareStreaming=False, whatever call GUI you try to use will expect to find the Media interfaces available; otherwise it is normal for it to fail.

I hope I didn't confuse you. If you need further help, I'm mostly available for talking on irc.

Regards,
George

[1] https://telepathy.freedesktop.org/spec/Call_Stream_Interface_Media.html#Enum:Stream_Transport_Type

_______________________________________________
telepathy mailing list
telepathy@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/telepathy
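P.S. The two special cases above boil down to a simple decision on the UI side. Here is a hedged sketch of that logic in plain Python: the HardwareStreaming property name is from telepathy-spec, SHM is one of the Stream_Transport_Type enum members from [1] (I'm using names as strings here rather than the spec's numeric values), and the function itself is illustrative, not real empathy code:

```python
# Illustrative decision a call UI makes when it finds a Call1 channel.
# Property and enum member names follow telepathy-spec; the function
# and its return values are invented for this sketch.

def plan_media_handling(channel_props, stream_transport):
    """Decide who handles the actual media for a Call1 channel."""
    if channel_props.get("HardwareStreaming", False):
        # Case 1: dedicated hardware (ex. a GSM modem) does the call;
        # neither the CM nor the UI touches the media stream.
        return "hardware"
    if stream_transport == "SHM":
        # Case 2: a proprietary CM streams the media itself; the UI
        # only handles the devices and moves raw data over shared memory.
        return "ui-devices-cm-streams"
    # Normal case: the CM signals, the UI streams, using the info
    # exposed on the Content/Stream Media interfaces.
    return "ui-streams"

# Examples:
plan_media_handling({"HardwareStreaming": True}, "ICE")  # -> "hardware"
plan_media_handling({}, "SHM")   # -> "ui-devices-cm-streams"
plan_media_handling({}, "ICE")   # -> "ui-streams"
```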