Another approach, which would be a *lot* easier to implement, would be to add 
a Layer public API that lets the app developer specify what layers there are 
and where they are (this has a lot of other benefits related to performance 
on embedded and mobile devices). Then, rather than solving the heavyweight / 
lightweight issue, allow heavyweight native controls to be added anywhere; 
you just have to be careful not to have any lightweights on the same layer 
as a heavyweight if they might overlap above the heavyweight.

Of course, you have this API already today in the form of SubScene! I haven't 
tried it in this context though.
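
For instance (completely untested sketch; the class name, sizes, and colors 
are all made up), two SubScenes stacked in a StackPane act as explicit layers, 
and a heavyweight native control could conceivably be sandwiched between them 
by the platform -- nothing below actually does that wiring:

    import javafx.application.Application;
    import javafx.scene.Group;
    import javafx.scene.Scene;
    import javafx.scene.SubScene;
    import javafx.scene.layout.StackPane;
    import javafx.scene.paint.Color;
    import javafx.scene.shape.Rectangle;
    import javafx.stage.Stage;

    // Two SubScenes stacked on top of each other, acting as explicit "layers".
    public class LayeredSubScenes extends Application {
        @Override
        public void start(Stage stage) {
            Rectangle base = new Rectangle(200, 200, Color.LIGHTGRAY);
            Rectangle overlay = new Rectangle(100, 100, Color.CORNFLOWERBLUE);
            overlay.setX(50);
            overlay.setY(50);

            SubScene lowerLayer = new SubScene(new Group(base), 320, 480);
            SubScene upperLayer = new SubScene(new Group(overlay), 320, 480);
            upperLayer.setFill(Color.TRANSPARENT); // let the lower layer show through

            // StackPane z-orders its children: lowerLayer first, upperLayer on top.
            // A native control would conceptually live between the two.
            StackPane root = new StackPane();
            root.getChildren().addAll(lowerLayer, upperLayer);

            stage.setScene(new Scene(root, 320, 480));
            stage.show();
        }

        public static void main(String[] args) {
            launch(args);
        }
    }

Whether that holds up with input events and rendering performance on a device 
is exactly the part I haven't tried.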

Richard

On Oct 22, 2013, at 6:21 AM, David Ray <cognitionmiss...@gmail.com> wrote:

> +1 re: Native L&F. IMO there is also nothing sacred about the "exactness" of 
> Apple's UI. They'll be changing it up a lot also. Being someone who 
> prefers custom looks to bland native looks anyway, I never did get the 
> "sacredness" of repeating "mirror-lookalike" grey :). Just my opinion; I'm 
> sure there are those who disagree.
> 
> David
> 
> Sent from my iPhone
> 
>> On Oct 22, 2013, at 7:17 AM, Tobias Bley <t...@ultramixer.com> wrote:
>> 
>> 1) Look and Feel:
>> 
>> IMO it's enough to build "native-looking", CSS-based skins! That could be 
>> done very quickly, without complex technologies like CALayer etc.
>> 
>> 2) After starting, RoboVM JavaFX needs roughly 10 seconds to start the 
>> simple list example on an iPhone 4, which is too long. I tried to specify 
>> the preloader class via the property "-Djavafx.preloader", but it doesn't 
>> work; my preloader class is never instantiated…
>> 
>> Tobi
>> 
>> 
>> 
>> On 21.10.2013, at 21:48, Richard Bair <richard.b...@oracle.com> wrote:
>> 
>>>> 1. Can you provide me with a detailed summary of where the iOS/Android
>>>> ports are currently?  This includes the platform specific stuff to make
>>>> either RoboVM or an Oracle JVM work?
>>> 
>>> I would say it is in a good prototype stage. It hasn't had heavy testing, 
>>> so the biggest risk is the unknown. Luckily, on iOS at least, there are only 
>>> a few devices, so it should be relatively easy for an app developer to 
>>> feel confident it works for their app. But for the OpenJFX project, testing 
>>> and certifying a whole variety of APIs will be quite a challenge. We have a 
>>> huge pile of tests; we just need:
>>>  1) A way to run our unit tests on the actual devices
>>>  2) A way to run graphical tests on devices (basically a port of Jemmy?)
>>> 
>>> I haven't scoped either of these efforts, but both are ripe areas for 
>>> collaboration with anybody in the community.
>>> 
>>> If it were heavily tested, I'd say most of the remaining work is actually 
>>> on the graphics / performance side. Path rendering performance is fairly 
>>> bad (though I've seen similar complaints about Cocoa path rendering 
>>> performance, so it may be that we're not that "bad" relatively speaking, 
>>> but it is still horrendous IMO and needs to be looked at).
>>> 
>>> The code is all there for the integration layer -- anybody with familiarity 
>>> with Objective-C and Cocoa, I'd say, go read the Glass code! This is a huge 
>>> opportunity for initial community involvement because:
>>>  1) There is a ton of existing documentation and code in the Apple universe 
>>> describing all the sorts of things we need to do in Glass
>>>  2) Glass is pretty decoupled from the rest of the platform, so you can 
>>> easily understand what is going on there without having to understand 
>>> everything else
>>> 
>>> Contributing on the Graphics side is more work and requires more 
>>> specialized skills. The fortunate thing here is that the graphics code is 
>>> shared with embedded (and desktop Mac and Linux) so there is a lot of 
>>> overlap.
>>> 
>>> So those would be the main things from my perspective: performance testing, 
>>> functional / unit testing, native platform integration, and graphics.
>>> 
>>> Another thing we've designed for from the beginning, but never validated, 
>>> is the ability to use a native control as the skin. The iOS prototype "hot 
>>> swaps" a native text field in when a TextInputControl gets focus, but this 
>>> is kinda funky (and there are lots of bugs here). The "right" thing to do 
>>> here would be to have a set of native skins for controls, and the ability 
>>> to create multiple core layers. So if you have a scene graph where you draw 
>>> stuff at the bottom, then have a native control, then draw more stuff over 
>>> the native control, what you would want on the Prism side is to use 3 
>>> hardware layers (one below the native control, one for the native control, 
>>> and one above it). I don't know:
>>>  1) How well this would work in practice with input events (but one could 
>>> imagine putting a native 'glass pane' on the top to intercept all events 
>>> and vector them through FX, forwarding to native controls as necessary as 
>>> part of the normal FX event dispatch) 
>>>  2) How many layers you could have before things would explode.
>>> 
>>> Alternatively, we could use image skinning to get the look right and only 
>>> do the funky native-control swap-in for text input, like we're doing today.
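>>> 
>>> To make the "glass pane" idea in (1) a bit more concrete, here is a rough 
>>> FX-side sketch only; NativePeer and its methods are invented placeholders 
>>> for whatever Glass would actually have to expose:
>>> 
>>>     import javafx.scene.input.MouseEvent;
>>>     import javafx.scene.layout.Region;
>>> 
>>>     // Hypothetical handle to a heavyweight native control on its own layer.
>>>     interface NativePeer {
>>>         boolean contains(double sceneX, double sceneY);
>>>         void forwardMouseEvent(MouseEvent e);
>>>     }
>>> 
>>>     // Transparent FX node sitting above everything; it intercepts input and
>>>     // decides whether a native control underneath should receive it.
>>>     class GlassPane extends Region {
>>>         GlassPane(NativePeer peer) {
>>>             setPickOnBounds(true); // receive events over the whole area even though nothing is drawn
>>>             addEventFilter(MouseEvent.ANY, e -> {
>>>                 if (peer.contains(e.getSceneX(), e.getSceneY())) {
>>>                     peer.forwardMouseEvent(e); // hand the event to the native control
>>>                     e.consume();               // keep it out of the FX nodes below
>>>                 }
>>>                 // otherwise normal FX event dispatch continues
>>>             });
>>>         }
>>>     }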
>>> 
>>>> 2. Are the iOS and Android ports at roughly the same level of
>>>> completeness/viability?
>>> 
>>> I think so. David / Tomas, can you guys add some insight into where we're 
>>> at from your perspective?
>>> 
>>>> 3. Exactly what is left in making these ports viable?  Here the word
>>>> "viable" is defined in my 6 Degrees of Separation post here
>>>> http://justmy2bits.com/2013/09/30/javafx-on-ios-and-android-six-degrees-of-separation
>>> 
>>> 1. Their look and feel is indistinguishable from native apps
>>> 
>>> As described above, there is work to be done here either by beefing up the 
>>> emulation, or adding the ability to intersperse native controls. The latter 
>>> approach I expect to be significant work, although technically feasible.
>>> 
>>> Also, I expect people will want to add more iOS specific controls (like 
>>> breadcrumb bars) to make this easier to do (rather than everybody styling 
>>> their own).
>>> 
>>> 2. They must load as quickly as native apps
>>> 
>>> I've heard RoboVM starts up very quickly. You will also package your app 
>>> with a splash screen, and I believe the Preloader APIs now work with iOS 
>>> (I haven't tested on RoboVM, but try it out and let me know if it works! 
>>> You will probably need to launch a bit differently, providing the preloader 
>>> as a system property, I think). So I expect this to work reasonably well.
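>>> 
>>> For reference, a minimal preloader could look something like the sketch 
>>> below (the class name is made up, and I haven't verified any of this on 
>>> RoboVM); you would then point at it with -Djavafx.preloader=SplashPreloader:
>>> 
>>>     import javafx.application.Preloader;
>>>     import javafx.scene.Scene;
>>>     import javafx.scene.control.ProgressBar;
>>>     import javafx.scene.layout.StackPane;
>>>     import javafx.stage.Stage;
>>> 
>>>     // Shown while the main Application class is being initialized.
>>>     public class SplashPreloader extends Preloader {
>>>         private Stage stage;
>>>         private final ProgressBar bar = new ProgressBar();
>>> 
>>>         @Override
>>>         public void start(Stage stage) {
>>>             this.stage = stage;
>>>             StackPane root = new StackPane();
>>>             root.getChildren().add(bar);
>>>             stage.setScene(new Scene(root, 320, 480));
>>>             stage.show();
>>>         }
>>> 
>>>         @Override
>>>         public void handleApplicationNotification(PreloaderNotification n) {
>>>             if (n instanceof ProgressNotification) {
>>>                 bar.setProgress(((ProgressNotification) n).getProgress());
>>>             }
>>>         }
>>> 
>>>         @Override
>>>         public void handleStateChangeNotification(StateChangeNotification n) {
>>>             // Hide the splash once the real application is about to start.
>>>             if (n.getType() == StateChangeNotification.Type.BEFORE_START) {
>>>                 stage.hide();
>>>             }
>>>         }
>>>     }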
>>> 
>>> 3. They must perform as well as native apps once loaded
>>> 
>>> This is the open question. We may find the graphics to be the bottleneck, 
>>> or we may find that the CPU usage is the bottleneck. On the CPU side, one 
>>> problem may be the large number of method calls to set / get property 
>>> values. Going to a "full lazy" style for many properties on Node might help 
>>> here, for instance.
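>>> 
>>> To illustrate what I mean by "full lazy" (shown on a made-up class rather 
>>> than the real Node code): the Property object is only created on first 
>>> access, so plain gets and sets stay cheap until somebody actually needs 
>>> binding or listeners.
>>> 
>>>     import javafx.beans.property.DoubleProperty;
>>>     import javafx.beans.property.SimpleDoubleProperty;
>>> 
>>>     class LazyNodeExample {
>>>         private DoubleProperty rotate; // created lazily
>>>         private double _rotate = 0;    // cheap storage until then
>>> 
>>>         public final double getRotate() {
>>>             return rotate == null ? _rotate : rotate.get();
>>>         }
>>> 
>>>         public final void setRotate(double value) {
>>>             if (rotate == null) {
>>>                 _rotate = value;       // no property object, nothing to notify
>>>             } else {
>>>                 rotate.set(value);
>>>             }
>>>         }
>>> 
>>>         public final DoubleProperty rotateProperty() {
>>>             if (rotate == null) {
>>>                 rotate = new SimpleDoubleProperty(this, "rotate", _rotate);
>>>             }
>>>             return rotate;
>>>         }
>>>     }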
>>> 
>>> 4. They must be able to utilise all (or at least most) of the native APIs, 
>>> devices and features that native apps can utilize
>>> 
>>> I would say this is a given, since you can use JNI to invoke any API. 
>>> However, if you want to embed a native widget in the app (such as an iAd 
>>> banner), then we (as a community) need to solve the problem of embedding 
>>> native controls in the scene graph (the layers issue described above).
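>>> 
>>> For the "invoke any API via JNI" part, the simplest case looks something 
>>> like the sketch below (the class, method, and library names are invented, 
>>> and RoboVM also has its own bindings mechanism that may be preferable to 
>>> raw JNI):
>>> 
>>>     // Java side: declares a method implemented in C / Objective-C, where it
>>>     // can call any platform API it likes.
>>>     public class NativeBridge {
>>>         static {
>>>             System.loadLibrary("nativebridge"); // loads libnativebridge
>>>         }
>>> 
>>>         public static native void vibrate(long milliseconds);
>>>     }
>>> 
>>>     // Native-side signature the JVM will look for:
>>>     // JNIEXPORT void JNICALL Java_NativeBridge_vibrate(JNIEnv *env, jclass cls, jlong ms);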
>>> 
>>> 5. They must be as available as native apps and available from the same 
>>> channels (e.g. iOS app store)
>>> 
>>> I think this is a given (nothing to do here except making whatever tweaks 
>>> Apple deems necessary).
>>> 
>>> 6. They must be as easy to update as native apps and through the same 
>>> channels
>>> 
>>> Again, I think this is a non-issue, because if you can submit via the app 
>>> store, then you can update via the app store.
>>> 
>>>> I know it's a pain to have to attempt to pacify the mob but I am sure I
>>>> speak on behalf of all JavaFX developers when I say that many of us have
>>>> serious financial or personal investment in JavaFX and *need* to know that
>>>> a mobile/tablet future is indeed possible.
>>> 
>>> Not a problem at all, it is important that we work together to make this a 
>>> reality. I'll spend the time necessary to help anybody who wants to get 
>>> their hands dirty with where to go in the code, how it works, and what 
>>> needs to be done!
>>> 
>>> Richard
>> 
