On 10.08.2015 at 19:20, Antonio Scuri wrote:
> Well, I really don't know.
>
> Just know that the calls are not actually "active all at the same
> time". When you call IupFlush in one of them, it stops that one,
> processes the pending messages one by one, and then returns to it. If
> that can occur recursively, it becomes a real nightmare to manage.
> You will have a hard time knowing in what order things are happening.
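A minimal sketch of the re-entrancy described above, assuming a plain C
IUP program: if the back button is clicked again while IupFlush is
draining pending messages, the same ACTION callback is entered a second
time before the first invocation returns. The static "busy" guard is
not from this thread; it is just one possible way to refuse the nested
call.

/* Sketch: IupFlush inside a callback dispatches pending clicks, which
 * re-enters this same callback unless it is guarded. */
#include <stdio.h>
#include <iup.h>

static int back_cb(Ihandle *self)
{
  static int busy = 0;
  if (busy) {
    printf("nested activation refused\n");
    return IUP_DEFAULT;     /* already inside: ignore the extra click */
  }
  busy = 1;

  /* ... long work: query the database, rebuild the controls ... */

  IupFlush();               /* pending messages are processed HERE;
                               a queued click would call back_cb again */

  busy = 0;
  return IUP_DEFAULT;
}

int main(int argc, char **argv)
{
  IupOpen(&argc, &argv);
  Ihandle *btn = IupButton("Back", NULL);
  IupSetCallback(btn, "ACTION", (Icallback)back_cb);
  IupShow(IupDialog(IupVbox(btn, NULL)));
  IupMainLoop();
  IupClose();
  return 0;
}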
Ah, I see. Maybe it would be a good idea to add a remark pointing this
out in the manual. Reading the manual, I gathered that calling IupFlush
would never hurt and would just advance the processing on the IUP side
of things.

> I would recommend you to take a step back and rethink that strategy.

So far this never became a strategy. It failed right away in the first
experiment. It is just my misunderstanding of the docs and an
experiment.

> Independent of IUP or other toolkits, what are you trying to
> implement?

It is a very simple RDF/DublinCore-inspired inventory application for a
small archive (an archive of physical things, a kind of museum - not
the family of file formats). I am using it as a "personal wiki" for my
notes too. The original is a web app. I am trying to replace the web
interface with a native GUI.

/Jörg

> Best,
> Scuri
>
>
> On Sat, Aug 8, 2015 at 9:58 AM, "Jörg F. Wittenberger" <
> [email protected]> wrote:
>
>> ((Sorry, this is not exactly the message I want to reply to, but at
>> least it's the correct thread. (The other message is already gone
>> here.)))
>>
>> I managed to understand the problem a little better.
>>
>> Maybe I'm doing something against the philosophy of correct IUP
>> usage here?
>>
>> What happened is that the action callback from my app's back-button
>> is called as often as I click it. Eventually my action callback will
>> also call IupFlush. (Is this a no-no?) This IupFlush in turn enables
>> the next action callback to be activated. Thus there are 4-5 calls
>> active at the same time.
>>
>> Each of those calls does essentially the same: recreate a scrollbox
>> full of controls (from a different data set), then detach the (only)
>> child of the target container, IupDestroy that detached child,
>> IupAppend the new one and IupRefresh the container.
>>
>> Since the data sets are different, those action callbacks take
>> different amounts of time (while waiting for the database etc.).
>>
>> By throwing some additional calls to IupFlush into the code (I
>> believed this _should_ not hurt, shouldn't it?) I have been able to
>> get it all mixed up, to the point that controls belonging to
>> different data sets / callbacks ended up mixed into the same
>> container.
>>
>> To put it in other words: those nested action callbacks in
>> combination with IupFlush enabled me to (accidentally) implement a
>> kind of green threads on top of IUP. Unfortunately without proper
>> locking.
>>
>> Having too few IupFlush calls there results in the segfault. More
>> flushes make it appear to work. Even more flushes confuse the layout
>> logic.
>>
>> How should this be done right?
>>
>> Thanks
>>
>> /Jörg
>>
>> On 05.08.2015 at 10:50, "Jörg F. Wittenberger" wrote:
>>> Hi Antonio,
>>>
>>> during IupFlush my program dumped a core like this:
>>>
>>> Program terminated with signal SIGSEGV, Segmentation fault.
>>> #0  0x4097693a in gtkCanvasSetDXAttrib.part.3 () from
>>> /usr/local/lib/libiup.so
>>> (gdb) bt
>>> #0  0x4097693a in gtkCanvasSetDXAttrib.part.3 () from
>>> /usr/local/lib/libiup.so
>>> #1  0x00000000 in ?? ()
>>>
>>>
>>> Anything I can do to help debugging that one?
>>>
>>> Best
>>>
>>> /Jörg

------------------------------------------------------------------------------
_______________________________________________
Iup-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/iup-users
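For reference, the detach/destroy/append/refresh sequence from the
quoted message of Aug 8 could look roughly like the sketch below in
plain C. It is only a sketch under assumptions: build_scrollbox() and
the dataset argument stand in for the application's own code, and the
swap is assumed to run to completion without an intervening IupFlush.

/* Sketch of the container swap: replace the single child of a
 * container with a freshly built scrollbox. */
#include <iup.h>

/* Hypothetical helper that rebuilds the scrollbox for one data set. */
static Ihandle* build_scrollbox(const char *dataset)
{
  Ihandle *content = IupVbox(IupLabel(dataset), NULL);
  return IupScrollBox(content);
}

static void swap_content(Ihandle *container, const char *dataset)
{
  Ihandle *old_child = IupGetChild(container, 0);   /* the only child */
  Ihandle *new_child = build_scrollbox(dataset);

  if (old_child) {
    IupDetach(old_child);
    IupDestroy(old_child);
  }
  IupAppend(container, new_child);
  IupMap(new_child);       /* parent is already mapped, so map the new child */
  IupRefresh(container);   /* recompute the layout */
}

The IupMap call is there because IupAppend does not map a child added
to an already mapped parent; IupRefresh alone only recomputes the
layout.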
