Disabling Screen Break with subsequent Preference Screens through XML.

2009-07-14 Thread Ravi

Hi,

I need to display another PreferenceScreen through XML that occupies the
full screen.

Calling addPreferencesFromResource(id) with a different id results in a
screen break handled by the preference framework.

setPreferenceScreen(mPS) could be called again, and it works fine, but
that is done in code. How do I do it through XML?

Neither inflateFromResource() nor PreferenceInflater is available, so I
can't use setPreferenceScreen for XML.

Please clarify.
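
One possible approach, only a sketch and not taken from this thread: give the
second hierarchy its own PreferenceActivity so it occupies the full screen on
its own. The class and resource names below (MorePrefsActivity,
R.xml.more_prefs) are made up for illustration.

import android.os.Bundle;
import android.preference.PreferenceActivity;

// Sketch: a separate activity that hosts the second preference hierarchy.
public class MorePrefsActivity extends PreferenceActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Loads the second XML hierarchy into its own full-screen activity
        // instead of appending it to the previous PreferenceScreen.
        addPreferencesFromResource(R.xml.more_prefs);   // hypothetical resource
    }
}

The first screen can then launch it with an Intent, for example from a
Preference's intent entry in XML or from an OnPreferenceClickListener.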



ACTION_TIME_TICK required for every second.

2009-06-26 Thread Ravi

ACTION_TIME_TICK occurs every minute. I need an action that occurs every
second because I'm trying to display the seconds hand in an AppWidget.

Please help, even if it means modifying the framework source code.

ACTION_TIME doesn't occur, and its documentation is not clear.
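
There is no broadcast that ticks every second, so one workaround, sketched
below and not taken from this thread, is a Service that re-posts a Runnable
every 1000 ms and pushes a fresh RemoteViews to the widget (at a real battery
cost). The provider, layout, and view ids here (ClockWidgetProvider,
R.layout.clock_widget, R.id.seconds) are hypothetical names.

import java.util.Calendar;

import android.app.Service;
import android.appwidget.AppWidgetManager;
import android.content.ComponentName;
import android.content.Intent;
import android.os.Handler;
import android.os.IBinder;
import android.widget.RemoteViews;

public class SecondsTickService extends Service {
    private final Handler handler = new Handler();
    private final Runnable tick = new Runnable() {
        public void run() {
            // Redraw the widget with the current second, then re-arm the timer.
            RemoteViews views = new RemoteViews(getPackageName(), R.layout.clock_widget);
            int second = Calendar.getInstance().get(Calendar.SECOND);
            views.setTextViewText(R.id.seconds, String.valueOf(second));
            AppWidgetManager.getInstance(SecondsTickService.this).updateAppWidget(
                    new ComponentName(SecondsTickService.this, ClockWidgetProvider.class), views);
            handler.postDelayed(this, 1000);
        }
    };

    @Override
    public void onStart(Intent intent, int startId) {
        super.onStart(intent, startId);
        handler.post(tick);
    }

    @Override
    public void onDestroy() {
        handler.removeCallbacks(tick);
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;   // not a bound service
    }
}

A seconds needle would follow the same pattern, swapping an ImageView
drawable per tick instead of text.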



AnalogClock AppWidget dial

2009-06-23 Thread Ravi

With AppWidgets, if we use a TextView in the provider layout XML, we can
set its text in Provider.java using the remoteViews.setTextViewText()
method.

Similarly, if I use an AnalogClock in the AppWidget layout XML, is there
any method to change the android:dial value using the RemoteViews or
AnalogClock APIs in Provider.java?

For example, I need to load one of two dial PNGs depending on AM or PM.

Please help me.
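
RemoteViews has no setter for android:dial, so one workaround, sketched below
and not from this thread, is to keep two widget layouts that differ only in
the dial drawable and pick one in the provider based on AM/PM.
R.layout.widget_am and R.layout.widget_pm are hypothetical resources.

import java.util.Calendar;

import android.appwidget.AppWidgetManager;
import android.appwidget.AppWidgetProvider;
import android.content.Context;
import android.widget.RemoteViews;

public class ClockProvider extends AppWidgetProvider {
    @Override
    public void onUpdate(Context context, AppWidgetManager manager, int[] appWidgetIds) {
        // Choose the layout whose AnalogClock declares the desired android:dial.
        boolean am = Calendar.getInstance().get(Calendar.AM_PM) == Calendar.AM;
        int layout = am ? R.layout.widget_am : R.layout.widget_pm;
        RemoteViews views = new RemoteViews(context.getPackageName(), layout);
        for (int id : appWidgetIds) {
            manager.updateAppWidget(id, views);
        }
    }
}

The provider would also need to be poked again around noon and midnight (for
example by an alarm) so the dial actually switches.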



Re: REG: Xml to Wbxml using EAS

2009-06-19 Thread Ravi

Could someone at least assist me with some help topics?

On Jun 17, 6:50 pm, Ravi  wrote:
> Anyone thr to help me :(
>
> On Jun 17, 8:56 am, Ravi  wrote:
>
>
>
> > Hi, It's urgent, Could anyone please respond to my queries.
>
> > On Jun 16, 11:01 am, Ravi  wrote:
>
> > > Dear friends,
>
> > > I need to convert from Xml to Wbxml for Client-Server Communication
> > > using AirSynC (Microsoft Exchange Active Sync Protocol).
>
> > > I found XML to WBXML parsers available in  ../dalvik/libcore/xml/src/
> > > main/java/org/kxml2 but there was no official documentation regarding
> > > these apis.
>
> > > Inbuilt Google IM Application available in ../packages/apps/IM makes
> > > use of separate XML<-->WBXML parsers instead of using existing
> > > org.kxml2 package listed above
>
> > > and makes use of Native call for SynCML Tag Table mapping.
>
> > > My doubts are as follows:
>
> > > 1.      Why ../dalvik/libcore/xml/src/main/java/org/kxml2 docs aren’t
> > > available, is it still in development stage?
> > > 2.      Why Inbuilt IM Application has gone for Native Code and own 
> > > Parsers
> > > instead of using ../dalvik/libcore/xml/src/main/java/org/kxml2 apis?
> > > 3.      In ../dalvik/libcore/xml/src/main/java/org/ package, Support for
> > > SynCML, WV & WML. Could I get the sample Codes using these apis?
>
> > > Once I get the sample for SyncML , I need to change the XML<-->WBXML
> > > mapping based on AirSync(EAS) instead of XML.
>
> > > > Please guide me.



Re: REG: Xml to Wbxml using EAS

2009-06-17 Thread Ravi

Is anyone there to help me? :(

On Jun 17, 8:56 am, Ravi  wrote:
> Hi, It's urgent, Could anyone please respond to my queries.
>
> On Jun 16, 11:01 am, Ravi  wrote:
>
>
>
> > Dear friends,
>
> > I need to convert from Xml to Wbxml for Client-Server Communication
> > using AirSynC (Microsoft Exchange Active Sync Protocol).
>
> > I found XML to WBXML parsers available in  ../dalvik/libcore/xml/src/
> > main/java/org/kxml2 but there was no official documentation regarding
> > these apis.
>
> > Inbuilt Google IM Application available in ../packages/apps/IM makes
> > use of separate XML<-->WBXML parsers instead of using existing
> > org.kxml2 package listed above
>
> > and makes use of Native call for SynCML Tag Table mapping.
>
> > My doubts are as follows:
>
> > 1.      Why ../dalvik/libcore/xml/src/main/java/org/kxml2 docs aren’t
> > available, is it still in development stage?
> > 2.      Why Inbuilt IM Application has gone for Native Code and own Parsers
> > instead of using ../dalvik/libcore/xml/src/main/java/org/kxml2 apis?
> > 3.      In ../dalvik/libcore/xml/src/main/java/org/ package, Support for
> > SynCML, WV & WML. Could I get the sample Codes using these apis?
>
> > Once I get the sample for SyncML , I need to change the XML<-->WBXML
> > mapping based on AirSync(EAS) instead of XML.
>
> > Please guide me.



Re: REG: Xml to Wbxml using EAS

2009-06-16 Thread Ravi

Hi, it's urgent. Could anyone please respond to my queries?

On Jun 16, 11:01 am, Ravi  wrote:
> Dear friends,
>
> I need to convert from Xml to Wbxml for Client-Server Communication
> using AirSynC (Microsoft Exchange Active Sync Protocol).
>
> I found XML to WBXML parsers available in  ../dalvik/libcore/xml/src/
> main/java/org/kxml2 but there was no official documentation regarding
> these apis.
>
> Inbuilt Google IM Application available in ../packages/apps/IM makes
> use of separate XML<-->WBXML parsers instead of using existing
> org.kxml2 package listed above
>
> and makes use of Native call for SynCML Tag Table mapping.
>
> My doubts are as follows:
>
> 1.      Why ../dalvik/libcore/xml/src/main/java/org/kxml2 docs aren’t
> available, is it still in development stage?
> 2.      Why Inbuilt IM Application has gone for Native Code and own Parsers
> instead of using ../dalvik/libcore/xml/src/main/java/org/kxml2 apis?
> 3.      In ../dalvik/libcore/xml/src/main/java/org/ package, Support for
> SynCML, WV & WML. Could I get the sample Codes using these apis?
>
> Once I get the sample for SyncML , I need to change the XML<-->WBXML
> mapping based on AirSync(EAS) instead of XML.
>
> Please guide me.



Re: libopencorehw implementation

2009-06-16 Thread Ravi

Are you using this code for your own hardware?

Did you check if "fd" was valid? Looks like "fd" can be uninitialized.

Dave might be able to help you. I am not too familiar with the device
specific code.

-Ravi

On Jun 16, 3:18 am, Andy Quan  wrote:
> Ravi, could you help look at line 1045 of this file?
> Line 1045:
> sp<MemoryHeapBase> master = (MemoryHeapBase *) fd;
>
> fd is a uint32 and stands for the file descriptor from the OMX unit. But this
> line crashed as soon as it was reached. I guess this crash is because of the
> definition of "sp".
>
> template<typename T>
> sp<T>& sp<T>::operator = (T* other)
> {
>     if (other) other->incStrong(this);
>     if (m_ptr) m_ptr->decStrong(this);
>     m_ptr = other;
>     return *this;
> }
>
> It seems that "fd->incStrong" is called but actually fd is only a file
> descriptor instead of a refbase object...
>
> Do you have any comment on this problem? Did I misunderstand anything? Thank
> you.
>
>
>
> On Tue, Jun 16, 2009 at 1:38 PM, Ravi  wrote:
>
> > Look at the code in release-1.0.
>
> >http://android.git.kernel.org/?p=platform/external/opencore.git;a=blo...
>
> > This was definitely a working piece of code. A very close (if not the
> > same) version was used in the first device release. But the code has
> > changed quite a bit.
>
> > The code under "samples" is a version spun off from the original, and
> > is merely to demonstrate the usage.
>
> > -Ravi
>
> > On Jun 15, 11:57 pm, Andy Quan  wrote:
> > > Hi,I find there are some sample files under
> > > external/opencore/android/samples demonstrating how child class of
> > > android_surface_output should be created. However, I did not find it
> > > eventually used in open source git. So my question is that, is this
> > sample
> > > file the same as the one used in G1 or HTC release?
> > > I reused this file but there come up some problems in PMEM usage. So I
> > > wonder if this is a verified source code or simply a demonstration. Thank
> > > you.
>
> > > --
> > > Thanks,
> > > Andy
>
> --
> Thanks,
> Andy



REG: Xml to Wbxml using EAS

2009-06-15 Thread Ravi

Dear friends,

I need to convert from XML to WBXML for client-server communication
using AirSync (the Microsoft Exchange ActiveSync protocol).

I found XML-to-WBXML parsers available in
../dalvik/libcore/xml/src/main/java/org/kxml2, but there is no official
documentation for these APIs.

The built-in Google IM application in ../packages/apps/IM uses its own
separate XML<-->WBXML parsers instead of the existing org.kxml2 package
listed above, and uses a native call for the SyncML tag-table mapping.

My doubts are as follows:

1.  Why aren't docs available for ../dalvik/libcore/xml/src/main/java/org/kxml2?
Is it still in a development stage?
2.  Why does the built-in IM application use native code and its own parsers
instead of the ../dalvik/libcore/xml/src/main/java/org/kxml2 APIs?
3.  The ../dalvik/libcore/xml/src/main/java/org/ package has support for
SyncML, WV, and WML. Could I get sample code using these APIs?

Once I get the SyncML sample, I need to change the XML<-->WBXML mapping
to be based on AirSync (EAS) instead.

Please guide me.
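
For what it's worth, the org.kxml2.wap classes implement the standard XmlPull
serializer interface, so basic usage would look roughly like the sketch below.
This is only an illustration under assumptions: the tag table is invented and
is not a real AirSync or SyncML code page.

import java.io.ByteArrayOutputStream;

import org.kxml2.wap.WbxmlSerializer;

public class WbxmlSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical code page 0; a real implementation would load the
        // tag tables defined by the target protocol (SyncML, AirSync, ...).
        String[] tagTable = { "Sync", "Collections", "Collection" };

        WbxmlSerializer serializer = new WbxmlSerializer();
        serializer.setTagTable(0, tagTable);

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        serializer.setOutput(out, null);                 // XmlSerializer API
        serializer.startDocument("UTF-8", null);
        serializer.startTag(null, "Sync");
        serializer.startTag(null, "Collections");
        serializer.endTag(null, "Collections");
        serializer.endTag(null, "Sync");
        serializer.endDocument();

        byte[] wbxml = out.toByteArray();                // bytes to put on the wire
        System.out.println(wbxml.length + " bytes of WBXML");
    }
}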




Re: libopencorehw implementation

2009-06-15 Thread Ravi

Look at the code in release-1.0.
http://android.git.kernel.org/?p=platform/external/opencore.git;a=blob;f=android/android_surface_output.cpp;h=aa63c05d94ae265056a671aff9548769c139778b;hb=release-1.0

This was definitely a working piece of code. A very close (if not the
same) version was used in the first device release. But the code has
changed quite a bit.

The code under "samples" is a version spun off from the original, and
is merely to demonstrate the usage.

-Ravi

On Jun 15, 11:57 pm, Andy Quan  wrote:
> Hi,I find there are some sample files under
> external/opencore/android/samples demonstrating how child class of
> android_surface_output should be created. However, I did not find it
> eventually used in open source git. So my question is that, is this sample
> file the same as the one used in G1 or HTC release?
> I reused this file but there come up some problems in PMEM usage. So I
> wonder if this is a verified source code or simply a demonstration. Thank
> you.
>
> --
> Thanks,
> Andy



Re: There is an error when playing http streaming file

2009-06-15 Thread Ravi

/** The video is streamed and its container is not valid for progressive
 *  playback, i.e., the video's index (e.g., the moov atom) is not at the
 *  start of the file.
 *  @see android.media.MediaPlayer.OnErrorListener
 */
public static final int MEDIA_ERROR_NOT_VALID_FOR_PROGRESSIVE_PLAYBACK = 200;

File: frameworks/base/media/java/android/media/MediaPlayer.java
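
On the application side this surfaces through MediaPlayer.OnErrorListener; a
minimal sketch (not from this thread) of reacting to it:

import android.media.MediaPlayer;

public class ProgressiveErrorHandling {
    // 200 == MEDIA_ERROR_NOT_VALID_FOR_PROGRESSIVE_PLAYBACK; the symbolic
    // constant may not be visible through the SDK of this release, so the
    // literal value is compared here.
    private static final int NOT_VALID_FOR_PROGRESSIVE_PLAYBACK = 200;

    static void install(MediaPlayer player) {
        player.setOnErrorListener(new MediaPlayer.OnErrorListener() {
            public boolean onError(MediaPlayer mp, int what, int extra) {
                if (what == NOT_VALID_FOR_PROGRESSIVE_PLAYBACK) {
                    // The moov atom is at the end of the file, so the clip
                    // cannot be played progressively over HTTP; e.g. prompt
                    // the user to download it or re-author the file.
                    return true;    // handled
                }
                return false;       // fall back to default handling
            }
        });
    }
}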


On Jun 15, 9:32 pm, hengli cui  wrote:
> Who can give me some idea?
>
> 2009/6/15 hengli cui 
>
> > When I try to play an http streaming file, there is an error, and the error
> > is PVMFErrContentInvalidForProgressivePlayback. What is this error and what
> > can I do to solve this problem?



Re: creating custom MIO

2009-06-12 Thread Ravi

Ohh...looks like the code has been approved but not yet merged. It is
waiting on a dependency to be merged.

You can download the changes by:
repo download platform/external/opencore 10254/2
repo download platform/frameworks/base 10274/1

Yes. It does provide the actual format type like mp3, mp4, etc.

-Ravi

On Jun 12, 4:58 am, manish  wrote:
> Ravi
>
> thanks, something like that would help. I couldn't find getFormatType()
> in the player engine or driver on either the master or the donut tree.
> Could you point me to the path?
> Also, does it provide format resolution down to information such as
> PVMF_MP3, or is it at a higher level [audio/video/text]?
> -manish
>
> On Jun 10, 5:09 am, Ravi  wrote:
>
> > You can query the engine for the format type after the ->AddDataSource
> > () command completes. So, for your usecase, you need to add code in
> > playerdriver to use the new API that we introduced in there, called
> > getFormatType(), to get the format type of the content used in
> > "AddDataSource()". You would do this just before "AddDataSink()" so
> > that you can make a decision on which MIO to use.
>
> > -Ravi
>
> > On Jun 10, 4:04 am, manish  wrote:
>
> > > Let me re-phrase the question.
> > > How can we find out that file format is aac or mp3 before creating
> > > audiosink node in playerdriver
> > > PV engine seems to know File formats "after" source and sink nodes
> > > have been created, a little too late. The source url for local
> > > playback files is of type sharedfd://10.0.80239432 , doesn't  give
> > > much idea of file extension really. How does PV parser know which
> > > parser node to create ... it must scan the actual file for this data ,
> > > as it seems mediaplayer/UI is passing a url like above to PV.
> > > apologize if this is off topic, but some pointers would be
> > > appreciated.
>
> > > On Jun 9, 12:01 am, manish  wrote:
>
> > > > Hi
>
> > > > Need playerdriver to call custom MIO for certain Audio Format, while
> > > > continuing to call AndroidAudioOutput for others. playerdriver does
> > > > not really know the audio format type[ eg PVMF_MP3], so what would be
> > > > the best way to achieve this ? also the Generic MIO interface
> > > > [ android_audio_mio.cpp] needs this information as well, to talk to
> > > > appropriate threadsafe implementation. Currently, it is hardcoded to
> > > > AndroidAudioOutput threadsafe implementation.
>
> > > > thanks,
> > > > M



Re: How to send a message or call a function written in audio decoder node from PlayerEngine?

2009-06-12 Thread Ravi

I guess I still don't understand what type of control you require for
your decoder. In the most generic case of a multimedia framework, one
would want only minimal control, exercised through a central entity, in
this case the pvPlayer engine. The player engine has the responsibility
of controlling the underlying components, including the OMX audio
decoder node.

Is it possible to expose these interfaces? Yes. But then the application
developer would have to handle hundreds of APIs to enable a single
playback session. Note that there are some cases where such an
interaction is convenient. For such cases, we do provide extension
interfaces (e.g., look at authordriver.cpp for the encoder extension
interfaces), but with limited capability. Again, this may change in the
future; we may provide the flexibility for the user to control this
through some APIs of the engine itself. So, in short, control of the
underlying nodes will not be exposed.

If you want to control your decoder from the node, it must be an OMX API
implementation. What API(s) do you want to call? What is the purpose? We
can discuss options if you spell out your use case.

-Ravi

On Jun 12, 12:33 am, saxoappeal  wrote:
> Dear Ravi:
>
> Thank you Ravi, I think that you are one of the most active and
> smartest engineers in the group.
> I have seen your name so many times, in so many articles.
> Anyway,
>
> Ravi:
> What function(s) do you want to call from the decoder node?
>
> - the function written in the decoder source in /omx/omx_mp3 folder
> from the PVMFOXMAudioDecNode.
>
> Ravi:
> The interface is through OMX IL. If you integrate your component
> (decoder)
> correctly, there shouldn't be a need to modify the client (node).
> - if i want to control the decoder from the node or application, what
> can I do..?
>
> in fact, finally I want to control the decoder from the java
> application.
> So, I think that I need a way to control the decoder from the
> PlayerEngine.
> And, I guess that the node interface PVMFNodeInterface* is the key
> pointer to control the node.
> But I am not sure whether that's right or not.
> And PVMFNodeInterface is the generic interface, so I can't refer
> to the actual PVMFOMXAudioDecNode from the engine source.
> Am I wrong? What is the most important thing that I should
> understand about the whole picture?
>
> Thanks
>
> On Jun 12, 2:10 pm, Ravi  wrote:
>
> > What function(s) do you want to call from the decoder node? The
> > interface is through OMX IL. If you integrate your component (decoder)
> > correctly, there shouldn't be a need to modify the client (node).
>
> > -Ravi
>
> > On Jun 11, 8:32 pm, saxoappeal  wrote:
>
> > > Hi All
>
> > > I have written my customized mp3 decoder in codecs/omx/omx_mp3.
> > > So, I found a way to send a message to decoder from audiodec_node in
> > > node.
>
> > > now, I need to know how to call a functions in the decoder from node
> > > or PlayerEngine.
> > > I have read the sources hundreds times, I only get the
> > > PVMFNodeInteface in the Engine.
> > > But, Every tries I had failed. and I have no idea and I am crying...
>
> > > It's maybe a kind of stupid question..but I really want to know it.
> > > Please help me...
>
> > > Thanks.



Re: How to send a message or call a function written in audio decoder node from PlayerEngine?

2009-06-11 Thread Ravi

What function(s) do you want to call from the decoder node? The
interface is through OMX IL. If you integrate your component (decoder)
correctly, there shouldn't be a need to modify the client (node).

-Ravi

On Jun 11, 8:32 pm, saxoappeal  wrote:
> Hi All
>
> I have written my customized mp3 decoder in codecs/omx/omx_mp3.
> So, I found a way to send a message to decoder from audiodec_node in
> node.
>
> Now, I need to know how to call functions in the decoder from the node
> or the PlayerEngine.
> I have read the sources hundreds of times; all I can get is the
> PVMFNodeInterface in the engine.
> But every attempt I made failed, I am out of ideas, and I am crying...
>
> It's maybe a kind of stupid question..but I really want to know it.
> Please help me...
>
> Thanks.



Re: creating custom MIO

2009-06-10 Thread Ravi

You can query the engine for the format type after the AddDataSource()
command completes. So, for your use case, you need to add code in the
playerdriver to use the new API we introduced there, getFormatType(), to
get the format type of the content used in AddDataSource(). You would do
this just before AddDataSink() so that you can decide which MIO to use.

-Ravi

On Jun 10, 4:04 am, manish  wrote:
> Let me re-phrase the question.
> How can we find out that file format is aac or mp3 before creating
> audiosink node in playerdriver
> PV engine seems to know File formats "after" source and sink nodes
> have been created, a little too late. The source url for local
> playback files is of type sharedfd://10.0.80239432 , doesn't  give
> much idea of file extension really. How does PV parser know which
> parser node to create ... it must scan the actual file for this data ,
> as it seems mediaplayer/UI is passing a url like above to PV.
> apologize if this is off topic, but some pointers would be
> appreciated.
>
> On Jun 9, 12:01 am, manish  wrote:
>
> > Hi
>
> > Need playerdriver to call custom MIO for certain Audio Format, while
> > continuing to call AndroidAudioOutput for others. playerdriver does
> > not really know the audio format type[ eg PVMF_MP3], so what would be
> > the best way to achieve this ? also the Generic MIO interface
> > [ android_audio_mio.cpp] needs this information as well, to talk to
> > appropriate threadsafe implementation. Currently, it is hardcoded to
> > AndroidAudioOutput threadsafe implementation.
>
> > thanks,
> > M



Re: OpenCore 1.0 vs. 2.0

2009-06-09 Thread Ravi

Inline are some details. I won't be doing this again though.

On Jun 9, 2:22 pm, crack74  wrote:
> Can I get some detailed information about following items?  I am
> interested what's the user impact of following items:
> - MP3 Dynamic TOC Construction
Certain MP3 clips do not have any sync samples embedded in them.
Repositioning those clips does not always work as the user expects
(e.g., a seek to 20 seconds might end up being a seek to 30 seconds). To
handle this better, a table of contents (TOC) is constructed that
contains file offsets and the corresponding durations.
Usefulness (in short): Better repositioning

> - mp3 parser - duration calculation by walking file in background
There are mp3 clips that do not have a duration value associated with
them in any form of metadata. For such clips, an initial rough estimate
of the duration is sent to the application. Thereafter, the entire mp3
clip is parsed in the background and the exact duration is calculated.
Usefulness (in short): Providing a more accurate duration

> - Author Engine Error Handling Robustness
Fixes for situations where the author engine had to be reset under
various error conditions. The goal is to make sure that the author
engine can be reset any time during a capturing session.
Usefulness (in short): Better stability

> - Player Engine Error Handling Robustness
Fixes for situations where the player engine had to be reset under
various error conditions.
Usefulness (in short): Better stability

> - Fundamental change in behavior of repositioning during 3GPP
> streaming
During a streaming session, say a user requests the player to seek to 30
seconds, but the server starts streaming data from 20 seconds. Earlier,
there was a wait of 10 seconds before playback resumed. Now, instead of
showing the user a blank screen for those 10 seconds, we start playing
from 20 seconds. The main motivation is that the "source" here will,
more often than not, not feed data faster than real time.
Usefulness (in short): Better user experience

> - Local Playback MP3 file does not display attached art work
The player engine was not able to grab the album art for some mp3
clips. This was fixed.
Usefulness (in short): Better user experience



Re: Question about music player in Opencore 2.04

2009-06-09 Thread Ravi

Please look at the API "private boolean seekMethod1(int keyCode)" in
MediaPlaybackActivity.java.

-Ravi
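
The general pattern there is just key handling in the activity; a rough
sketch (not the actual Music app code, and the key-to-action mapping below is
made up):

import android.app.Activity;
import android.view.KeyEvent;

public class PlaybackKeysActivity extends Activity {
    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        switch (keyCode) {
            case KeyEvent.KEYCODE_Q:        // hypothetical: seek backward
                // seekBackward();
                return true;
            case KeyEvent.KEYCODE_W:        // hypothetical: seek forward
                // seekForward();
                return true;
            default:
                return super.onKeyDown(keyCode, event);
        }
    }
}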

On Jun 9, 12:05 pm, Alex_26  wrote:
> Thanks for the reply.
>
> The ones from the first row on zoom2 hardware (QWERTY). Also
> backspace to pause. That is what I thought but I just need
> confirmation. Do you know which file has event handler for the
> keyboard in the music player ?
>
> On Jun 9, 11:39 am, Ravi  wrote:
>
> > OpenCORE does not have a direct dependency on any keys on the physical/
> > virtual keyboard. That being said, applications can react to certain
> > key events and take action accordingly.
>
> > What are the keys that make the mediaplayer FF/Rew ?
>
> > -Ravi
>
> > On Jun 9, 10:29 am, Alex_26  wrote:
>
> > > Hi. This is probably a silly question but I just need confirmation. In
> > > the music player application in Opencore 2.0 we had a test case where
> > > pressing 150 short keys should not result in any strange behavior in
> > > the audio playback. Now the testing team has reported that in opencore
> > > 2.04 when you press certain keys a fast forward or rewind (depending
> > > on the key) in the playback is seen. I think this is expected so the
> > > user does not depend on the touchscreen to make seek operations.
>
> > > Can someone confirm if this is expected behavior for the music player
> > > in Opencore 2.04?
>
> > > Thanks!



Re: OpenCore 1.0 vs. 2.0

2009-06-09 Thread Ravi

The changelog does list most of the issues. It is up to the user to
decide whether or not to pick the latest codebase.

-Ravi

On Jun 9, 12:02 pm, crack74  wrote:
> Are there any specific/critical issue in OpenCore 1.0 that are
> addressed in 2.0.x release?  From the changelog(http://
> android.git.kernel.org/?p=platform/external/
> opencore.git;a=blob;f=ChangeLog;h=ef3588c910e0cbb109db160ee324caf73997ea3b;hb=642e1d2b4da40c6dcb79b52bc68222ee018e77b2),
> I see many changes but not sure which changes are critical to ordinary
> user's day to day activities, like watching video, playing music, and
> etc.
>
> Can someone list the major fixes?



Re: Question about music player in Opencore 2.04

2009-06-09 Thread Ravi

OpenCORE does not have a direct dependency on any keys on the physical/
virtual keyboard. That being said, applications can react to certain
key events and take action accordingly.

What are the keys that make the mediaplayer FF/Rew ?

-Ravi

On Jun 9, 10:29 am, Alex_26  wrote:
> Hi. This is probably a silly question but I just need confirmation. In
> the music player application in Opencore 2.0 we had a test case where
> pressing 150 short keys should not result in any strange behavior in
> the audio playback. Now the testing team has reported that in opencore
> 2.04 when you press certain keys a fast forward or rewind (depending
> on the key) in the playback is seen. I think this is expected so the
> user does not depend on the touchscreen to make seek operations.
>
> Can someone confirm if this is expected behavior for the music player
> in Opencore 2.04?
>
> Thanks!



Re: pmem usage in opencore of Android1.5

2009-06-09 Thread Ravi

Yes. Only "fd" and "offset" are sent to the MIO.

Regarding where the MemHeapBase is constructed, it could be a
vendor-specific implementation. I am not sure about this.

-Ravi

On Jun 9, 7:10 am, Andy Quan  wrote:
> I have a question about pmem usage during video playback in opencore.
> In android_surface_output_fb.cpp, there is a special path for YVUSemiplanar
> format, where a private struct pointer is passed from video omx unit,
> i.e. data_header_info.private_data_ptr. 2 local functions are provided as
> "getPmemFd" and "getOffset" to achieve file descriptor and offset.
>
> My question is: does video omx pass "fd" and "offset" only to MIO or does it
> pass MemHeapBase pointer? The latter might mean that MemHeapBase is
> initially constructed inside video omx unit. I just could not recognize
> where that MemHeapBase is created. Anyone can help me understand? Thanks in
> advance!!
>
> --
> Thanks,
> Andy



Re: Recognizer plugins are not getting executed when I run .mp4 file playback use case

2009-06-09 Thread Ravi

Which test application are you using? (i) pvplayer_engine_test, or
(ii) a mediaplayer test, or (iii) your own test app?

(i) As I mentioned earlier, this test app guesses the format type based
on the file extension, so a .mp4 file has its format type set
accordingly. If you want to force the "recognition", rename the file
extension to .xyz.

(ii) If you have proper logs, it should definitely show up.

(iii) Good luck.

-Ravi

On Jun 9, 7:45 am, Freepine  wrote:
> #define LOG_NDEBUG 0
> is required to enable LOGV.
>
> On Tue, Jun 9, 2009 at 8:38 PM, Dev  wrote:
> > Hi,
>
> > I have added the logs using android macros LOGV(""); directly.
>
> > Thanks and Regards,
> > -Devaraj
>
> > On Tue, Jun 9, 2009 at 5:56 PM, Freepine  wrote:
>
> >> How did you add the logs into the mp4 recognizer? By printf? Or using an android
> >> macro (e.g., LOGE) directly? If you were using the PV logging mechanism, did you
> >> get a logger object first?
> >> e.g.
> >> iLogger = PVLogger::GetLoggerObject("PVPlayerEngine");
>
> >> It would be helpful if you pasted your code snippet.
>
> >> On Tue, Jun 9, 2009 at 8:14 PM, Devaraj  wrote:
>
> >>> Hi All,
>
> >>> I have added few prints in the pvmp4ffrecognizer module source files.
> >>> when I ran a .mp4 file playback use case and  collected the traces.
> >>> I could not find the traces that I have added.
> >>> I strongly doubt that during the .mp4 file playback none of the
> >>> functions from the pvmp4ffrecognizer module is getting called.
> >>> I am using an android media player application to run the .mp4 file.
> >>> Is my understanding correct?
> >>> If so, without executing the mp4 recognizer how is it able to play
> >>> the .mp4 file?
>
> >>> Thanks and Regards,
> >>> -Devaraj



Re: Passing pmem FDs from OMX to MIO

2009-06-09 Thread Ravi

Refer to lines 1685-1710 in
http://android.git.kernel.org/?p=platform/external/opencore.git;a=blob;f=nodes/pvomxvideodecnode/src/pvmf_omx_videodec_node.cpp;h=ead6d65f91830b6cadf3f0105f61c9f07731;hb=fd46810380dc8857b48fb5100bd359a5a71f85d5.

We have the mechanism in place for the OMX components to send a pointer
and a length parameter down to the MIOs.

-Ravi

On Jun 9, 8:24 am, Freepine  wrote:
> If the FD stays unchanged during playback session, perhaps you can also pass
> it to MIO via PvmiCapabilityAndConfig interface.You can check how video
> display info get passed to androidSurfaceOutput.
>
> On Tue, Jun 9, 2009 at 5:00 PM, manish  wrote:
>
> > Hi
>
> > I saw discussion where renderer allocates memory to be used by
> > decoder.
> > I would like to pass pmem fds from omx component to MIO. [ a specific
> > custom audio MIO] . Can some one elaborate on the omx->pv and pv->Mio
> > interface changes to achieve something like this ? ideal thing would
> > be to get fd as a parameter along with data pointer , data length ,
> > cmdid, timestamp etc in MIO through the writeasync call. and then
> > again , it is needed only for a specific MIO implementation, without
> > having to change writeasync api for others.
>
> > thanks



Re: opencore PVPlayerEngine::SetPlaybackRate cannot work

2009-06-08 Thread Ravi

PlaybackRate has not been implemented for mp3.

PVMFStatus PVMFMP3FFParserNode::DoSetDataSourceRate(PVMFMP3FFParserNodeCommand& aCmd)
{
    PVLOGGER_LOGMSG(PVLOGMSG_INST_LLDBG, iLogger, PVLOGMSG_STACK_TRACE,
                    (0, "PVMFMP3FFParserNode::DoSetDataSourceRate() In"));
    OSCL_UNUSED_ARG(aCmd);
    return PVMFSuccess;
}

On Jun 8, 10:50 pm, allstars  wrote:
> hello world
> i want to test rate feature for mp3
> so i add setRate from top to the bottom
> (Java -> JNI -> libmedia -> libmediaplayerservice -> playerdriver in
> opencore)
> however i found PVPlayerEngine::SetPlaybackRate didnt do any thing
> i have also add logs for it and in the DoSetPlaybackRate function
> no error is found
>
> i call it like this SetPlaybackRate(200,000, 0, cmd);
>
> however ,if i can GetPlaybackMinMaxRate , i can get min/max rate
> returned in the
> asyn handler
>
> btw i dont know how to use GetPlaybackRate , there is no sample for
> this
> and i am not sure how to use the Timebase argument
>
> have anyone met the same problem as i did???
>
> thanks



Re: What should be the configuration contents of pvlogger.txt to get the traces from modules of \pvmi\recognizer\plugins?

2009-06-08 Thread Ravi

Repost. I responded in the other thread.

On Jun 8, 6:14 am, Devaraj  wrote:
> As we enter "8,PVPlayerEngine" in pvlogger.txt to get the traces
> from PVPlayerEngine, what should the configuration contents of
> pvlogger.txt be for the modules present in \pvmi\recognizer\plugins?



Re: a/v sync timing in Android 1.5

2009-06-08 Thread Ravi

That is correct. The late margin is what makes the framework drop
video frames.

-Ravi

On Jun 7, 10:53 pm, Andy Quan  wrote:
> Hi,I met some severe frame drops in high motion movie playback in Android
> 1.5. I know that there are "early margin" and "late margin" in
> "pv_player_engine_tunables.h". I'd like to know if these are the only 2
> parameters I should try to tune for this problem. Any other parameters I
> should pay attention to? Thanks in advance.
>
> --
> Thanks,
> Andy



Re: How to get traces from the pvmi\recognizer\plugins\pvmp4ffrecognizer\src files

2009-06-08 Thread Ravi

Are you using the pvplayer_engine_test app. or the android media
player?

If pvplayer_engine_test, the test app. assigns the format type of the
content being used based on the file extension. So, there is no
recognition. If you want to force it, you can just rename the file as
test.xyz (i.e., an invalid extension).

If android media player, I don't know. Since you already have the logs
from the other components, maybe you can figure it out.

-Ravi

On Jun 8, 4:42 am, Devaraj  wrote:
> Hi,
>
> I have added some prints like the below:
>
>  PVLOGGER_LOGMSG(PVLOGMSG_INST_LLDBG, iLogger, PVLOGMSG_STACK_TRACE,
> (0, "PVMP4FFRecognizerPlugin::GetRequiredMinBytesForRecognition() IN
> \n"));
>
> in the file pvmp4ffrec_plugin.cpp. I am not getting these traces when
> I run the MP4 video file where as I am getting the traces from other
> module source files. what may be the reason?
>
> Thanks and Regards,
> -Devaraj



Re: Can't generated video thumbnails in Android 1.5

2009-06-08 Thread Ravi

In this mode, the parsers look for the best sync sample among the first
10 sync samples. "Best" here is defined as the biggest frame in terms of
bytes, with the rationale that an I-frame with more bytes will carry
more pictorial information than the others.

So, first and foremost, you need to check how many I-frames your
encoder is producing, if any. If there are I-frames, look at the mp4
parser node to see why you are not seeing any thumbnail. Or, file a
bug, along with the clip in question, and we can take a look.

-Ravi

On Jun 8, 1:41 am, Jia Meng  wrote:
> I found there is a macro "BEST_THUMBNAIL_MODE" defined in
> metadriver.h. If it is turned on, video thumbnails can't be generated
> in Gallery for streams recorded by our own MEPG-4 encoder. What does
> "best thumbnail mode" mean? How should I modify our streams to work
> under this mode?
>
> Thanks



Re: Access Opencore engine through JNI

2009-06-08 Thread Ravi

It is not advisable to bypass "mediaplayer" to reach the PVPlayer class
directly. There is no guarantee that your app (and your JNI layer) will
remain compatible for future use cases.

OpenCORE can technically be instantiated as thread-safe. However, in the
current version, that has not yet been done. So, there is a playerdriver
layer that queues all commands from the various threads onto a single
thread.

-Ravi
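
For completeness, the supported path from Java stays on
android.media.MediaPlayer, which funnels every call through the media server
and the playerdriver; a minimal sketch (the file path is made up):

import java.io.IOException;

import android.media.MediaPlayer;

public class SupportedPlaybackPath {
    static MediaPlayer playLocalFile() throws IOException {
        MediaPlayer player = new MediaPlayer();
        // Everything below is marshalled to MediaPlayerService and, from
        // there, serialized onto the single playerdriver/engine thread.
        player.setDataSource("/sdcard/sample.mp3");   // hypothetical path
        player.prepare();
        player.start();
        return player;
    }
}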

On Jun 5, 10:40 pm, allstars  wrote:
> hello world
> i would like to use some API in opencore/sonivox
> but these API aren't included in libmediaplayerservice ,nor
> android.media
>
> so i would like to add my JNI to construct a PVPlayer like the
> libmediaplayerservice did
> is this ok?
> i mean is it thread-safe in opencore engine
>
> i found i need to modify a lot files if i need to go through libmedia/
> libmediaplayerservice
> for example IMediaPlayer, IMediaPlayerService, etc.
>
> thanks



Re: Failure [INSTALL_FAILED_MISSING_SHARED_LIBRARY]

2009-06-02 Thread Ravi

+ 1

The logcat shows the following:
*
06-02 23:54:27.410   583   604 D PackageParser: Scanning package: /
system/app/PlatformLibraryClient.apk
06-02 23:54:27.450   583   604 I PackageManager: /system/app/
PlatformLibraryClient.apk changed; collecting certs
06-02 23:54:28.040   583   604 D PackageManager: Scanning package
com.example.android.platform_library.client
06-02 23:54:28.050   583   604 E PackageManager: Package
com.example.android.platform_library.client requires unavailable
shared library com.example.android.platform_library; ignoring!
*
I verified that the required libraries are present on the emulator.

syncing /system...
push: /home/korg/mainline/out/target/product/generic/system/lib/
libplatform_library_jni.so -> /system/lib/libplatform_library_jni.so
push: /home/korg/mainline/out/target/product/generic/system/app/
PlatformLibraryClient.apk -> /system/app/PlatformLibraryClient.apk
push: /home/korg/mainline/out/target/product/generic/system/etc/
permissions/com.example.android.platform_library.xml -> /system/etc/
permissions/com.example.android.platform_library.xml
push: /home/korg/mainline/out/target/product/generic/system/framework/
com.example.android.platform_library.jar -> /system/framework/
com.example.android.platform_library.jar
4 files pushed. 311 files skipped.
20 KB/s (10184 bytes in 0.477s)
*

On May 21, 9:39 am, Prasad Duggineni 
wrote:
> summit,
>
> When i try to run the client app, i have below error. package.apk which
> is copied from
>
> \out\target\product\obj\APPS\PlatformLibraryClient_intermediates
>
> C:\android-sdk-windows-1.1_r1\tools>adb push libplatform_library_jni.so
> /system/frameworks
> 1108 KB/s (0 bytes in 53227.000s)
> C:\android-sdk-windows-1.1_r1\tools>adb push libplatform_library_jini.so
> /system
> /frameworks
> C:\android-sdk-windows-1.1_r1\tools>adb install package.apk
> 215 KB/s (0 bytes in 3447.000s)
>         pkg: /data/local/tmp/package.apk
> Failure [INSTALL_FAILED_MISSING_SHARED_LIBRARY]
> let me know how did u resolve the missing shared library issue?
>
> On Tue, May 19, 2009 at 11:32 PM, sumit  wrote:
>
> > can u give me more clearance of ur problem .
> > actually i had made same kind of thing of platform library  and one
> > testing application to use my API and when i tried to install this
> > test application its give me error but now i resolve this error..so
> > tell me which kind of problem do u have ..
>
> > On May 15, 12:32 am, dvp  wrote:
> > > Naveen,
> > > I had similar issue even though installed it using the adb push
> > > libplatform_library_jni.so /system/lib/libplatform_library_jni.so. any
> > > suggestion to resolve this issue?
>
> > > Prasad
>
> > > On May 6, 3:04 am, sumit  wrote:
>
> > > > wt is the right place for it ,
> > > > .so is generated in --
> > > > out/target/product/generic/obj/SHARED_LIBRARIES/
> > > > libimsfwk_intermediates/LINKED
> > > > place .
>
> > > > On May 6, 11:28 am, XC He  wrote:
>
> > > > > did you copy the .so to the right place?
>
> > > > > Best Regards
> > > > > ===
> > > > > XiangCheng He
>
> > > > > 2009/5/6 Dianne Hackborn 
>
> > > > > > Did you read the documentation there about how to register the
> > library with
> > > > > > the system?  And did you actually build it into the system?
>
> > > > > > On Tue, May 5, 2009 at 3:06 AM, sumit 
> > wrote:
>
> > > > > >> i want install /root/mydroid/MyAndroid/development/samples/
> > > > > >> PlatformLibrary/client
> > > > > >> intermediated/pacage.apk  into emulator
>
> > > > > >> but i got this error --
> > > > > >> :~/mydroid/MyAndroid/out/target/product/generic/obj/APPS/
> > > > > >> PlatformLibraryClient_intermediates# adb install package.apk
> > > > > >> 41 KB/s (3448 bytes in 0.080s)
> > > > > >>        pkg: /data/local/tmp/package.apk
> > > > > >> Failure [INSTALL_FAILED_MISSING_SHARED_LIBRARY]
>
> > > > > >> so what i will for that
> > > > > >> can any body tell me .
>
> > > > > >> by using using this i try to add some api in android source code
> > but
> > > > > >> my first step is not working properly.
>
> > > > > >> thanks in advance ...
>
> > > > > > --
> > > > > > Dianne Hackborn
> > > > > > Android framework engineer
> > > > > > hack...@android.com
>
> > > > > > Note: please don't send private questions to me, as I don't have
> > time to
> > > > > > provide private support, and so won't reply to such e-mails.  All
> > such
> > > > > > questions should be posted on public forums, where I and others can
> > see and
> > > > > > answer them.

Re: aac header info in PVAuthor

2009-04-26 Thread Ravi

1. What is the format PVMF_MIME_AAC_MP4FF? Is it AAC in mpeg4 format?
If so, the format is "PVMF_MIME_MPEG4_AUDIO".

2. It should be the 2 bytes of AudioSpecificConfig.
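
For plain AAC-LC, those 2 bytes pack the audio object type, the
sampling-frequency index, and the channel configuration (per ISO/IEC
14496-3); a small sketch of how they are commonly assembled, as an
illustration rather than PV code:

public class AudioSpecificConfigSketch {
    /**
     * 5 bits audioObjectType | 4 bits samplingFrequencyIndex |
     * 4 bits channelConfiguration | 3 zero bits (GASpecificConfig flags).
     */
    static byte[] build(int audioObjectType, int samplingFrequencyIndex, int channelConfiguration) {
        int bits = (audioObjectType << 11) | (samplingFrequencyIndex << 7) | (channelConfiguration << 3);
        return new byte[] { (byte) (bits >> 8), (byte) bits };
    }

    public static void main(String[] args) {
        // AAC-LC (object type 2), 44100 Hz (index 4), stereo (2) -> 0x12 0x10
        byte[] asc = build(2, 4, 2);
        System.out.printf("0x%02X 0x%02X%n", asc[0] & 0xFF, asc[1] & 0xFF);
    }
}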

On Apr 26, 11:44 am, Andy Quan  wrote:
> Hi,
> I have a few questions with regard to AAC encoder OMX integration in
> OpenCORE v2.x.
>
> 1. If the output format is PVMF_MIME_AAC_MP4FF, is it the OMX IL's
> responsibility to generate AudioSpecificConfig defined in ISO spec?
>
> 2. In FillBufferDoneProcessing, it is indicated that the first output buffer
> of AAC encoder should be codec specific header info, is this "codec header
> info" exactly the struct of "AudioSpecificConfig"? Any more information
> involved?
>
> --
> Thanks,
> Andy



Re: H263 video recording

2009-04-23 Thread Ravi

SW H.263 encoder has been tested on the device. It works just fine.

I don't know what the fake camera is.

On Apr 23, 10:51 am, Dave Sparks  wrote:
> This is not a scenario that we have ever tested internally. The
> software H.263 encoder has never been tested by Google and I doubt
> that it has ever been tested in the Android tree by PV.
>
> On Apr 23, 6:34 am, Lucien  wrote:
>
> > Dear all :
> > I use default camera AP and switch to video mode to do video
> > recording.
> > The video encoder is H.263 and audio encoder is nb-amr.
> > I can record with fake camera.
> > But the result is not correct.
> > Because when I display the recorded file on PC,I can hear the voice
> > but only see some frames.
> > In addition, the recorded file can't be played by Android media
> > player.
> > It seems like the h.263 encoder doesn't encode the data correctly.
> > Could anybody kindly help me to use default PV software codec to do
> > the video recording with fake camera.
> > Is there any setting or configure I need to modify?
> > Thanks!!



Re: encoder support in OpenCORE/cupcake

2009-04-22 Thread Ravi

Let me try to summarize the many questions that I have received so far
in this context:
OpenCORE 1.0 on master had the author engine working with the non-OMX
codecs as well. On Cupcake, this is not the case. Cupcake has an OMX
node that supports an MPEG-4 OMX video component (in this case, the HW
OMX component). But if the HW component is not present, it is expected
to fall back to the non-OMX SW codec. This functionality is missing.


On Apr 21, 7:16 pm, Dave Sparks  wrote:
> OpenCORE 1.0 does not have any built-in OMX software encoders, so
> there is no need to support them.
>
> On Apr 21, 8:50 am, Andy Quan  wrote:
>
> > Thanks, Ravi.
>
> > "In Cupcake, the OMX encoder component is hard-coded into the OMX
> > encoder node. It's not an issue because there are no software OMX
> > encoders. "
>
> > What does this mean? Hardware OMX component for internal use?
>
> > On Tue, Apr 21, 2009 at 8:32 PM, Ravi  wrote:
>
> > > Repost ...
>
> > >http://groups.google.com/group/android-framework/browse_thread/thread...
>
> > > On Apr 21, 7:27 am, Andy Quan  wrote:
> > > > I find in ./engines/author/src/single_core/pvaenodefactoryutility.h,
> > > > "PVMFOMXVideoEncNodeFactory::CreateVideoEncNode()" is called to create
> > > video
> > > > node. In my understanding, this means OMX is expected to be used in
> > > > PVAuthor. However, I did not find any encoder OMX provided by PV under
> > > > opencore folder...
>
> > > > My question is: is encoding path in opencore of cupcake tested? I think
> > > > theoratically 3rd party OMX should work in that node but I have no idea
> > > > whether that node itself can work correctly.
>
> > > > Many thanks in advance!
>
> > > > --
> > > > Thanks,
> > > > Andy
>
> > --
> > Thanks,
> > Andy



Re: SBC integration license issue in BlueZ/Android

2009-04-21 Thread Ravi

Bumping this up.

Can someone from Google please comment on this?

-Ravi

On Apr 20, 12:56 am, Ravi  wrote:
> I do see the GPL license. That's bizarre. I was under the impression
> that the entire android tree has to be provided with the Apache
> license. Maybe someone from Google can confirm.
>
> Just fyi...PacketVideo has provided their version of the SBC encoder
> already as part of OpenCORE [external/opencore/codecs_v2/audio/sbc].
>
> -Ravi
>
> On Apr 20, 12:13 am, Andy Quan  wrote:
>
> > Hi,
> > As far as I remember, BlueZ is released under GPL license. Is this the same
> > in android? If so, does this mean if I want to integrate my customized SBC
> > encoder lib into this framework, I have to make my encoder open source?
>
> > --
> > Thanks,
> > Andy



Re: encoder support in OpenCORE/cupcake

2009-04-21 Thread Ravi

Yes. I believe the change was made to use only the HW OMX encoders.

On Apr 21, 10:50 am, Andy Quan  wrote:
> Thanks, Ravi.
>
> "In Cupcake, the OMX encoder component is hard-coded into the OMX
> encoder node. It's not an issue because there are no software OMX
> encoders. "
>
> What does this mean? Hardware OMX component for internal use?
>
>
>
> On Tue, Apr 21, 2009 at 8:32 PM, Ravi  wrote:
>
> > Repost ...
>
> >http://groups.google.com/group/android-framework/browse_thread/thread...
>
> > On Apr 21, 7:27 am, Andy Quan  wrote:
> > > I find in ./engines/author/src/single_core/pvaenodefactoryutility.h,
> > > "PVMFOMXVideoEncNodeFactory::CreateVideoEncNode()" is called to create
> > video
> > > node. In my understanding, this means OMX is expected to be used in
> > > PVAuthor. However, I did not find any encoder OMX provided by PV under
> > > opencore folder...
>
> > > My question is: is encoding path in opencore of cupcake tested? I think
> > > theoratically 3rd party OMX should work in that node but I have no idea
> > > whether that node itself can work correctly.
>
> > > Many thanks in advance!
>
> > > --
> > > Thanks,
> > > Andy
>
> --
> Thanks,
> Andy



Re: amr-wb codec in .3gp file format

2009-04-21 Thread Ravi

What happens if you push a new mp4 file? Can you see the file when you run
"adb shell ls -l /sdcard"?

For the file to be detected, did you try restarting the emulator?

For new formats to be detected by the mediascanner, you need to start
from frameworks/base/media/java/android/media/MediaFile.java.



On Apr 21, 7:47 am, chuan  wrote:
> When push file below is showed:
> adb push vec.qcp sdcard/
> 1585 KB/s (0 bytes in 126827.000s)
>
> On Apr 21, 8:28 pm, Ravi  wrote:
>
> > When was the last time you did your "repo sync"?
>
> > When you "push" the content, do you see any error? Are you sure the
> > sdcard is working for any other application?
>
> > Note that on master, the sdcard functionality is 
> > broken. http://code.google.com/p/android/issues/detail?id=2335
>
> > On Apr 21, 6:30 am, chuan  wrote:
>
> > > I modify MediaScanner.java MediaFile.java too, but no file can be
> > > scanned out successfully as well as .mp4/3gp/3g2
>
> > > On Apr 21, 6:23 pm, chuan  wrote:
>
> > > > Dear Ravi,
> > > >I met a problem when testing video/audio playbak on opencore 2.0, I
> > > > can't scan out media files that I push into the sdcard, my step as
> > > > below:
> > > >1.  adb push test.mp4 sdcard/  (or other audio/video files)
> > > >2.  reset emulator and run it using "emulator -sdcard sdcard.img"
> > > >This happen when I change opencore from 1.x to 2.0. In opencore 1.x
> > > > it can always work and i can find media file in Music or VideoPlayer
> > > > app, but now i cannot get them.
> > > >The difference is In opencore 1.x I useScannerin Dev Tool to scan
> > > > files,while in 2.0 I have  to reset  emulator( it seems scan work is
> > > > done when emulator is set up). I only get a file occasionally.
> > > >AND, if I want to scan out a new file format(such as .awb or .qcp),
> > > > what need to be done except modify mediascanner.cpp ? ( only scan out,
> > > > other work is beyond consideration)
> > > >Many thanks!
>
> > > > On Apr 15, 10:40 pm, Ravi  wrote:
>
> > > > > On Apr 15, 7:11 am, chuan  wrote:> Dear Ravi:
> > > > > >Thank so much for your quickly reply, as i am try hard to figure 
> > > > > > it
> > > > > > out .
> > > > > > Now i wish to share my plan and wish get help from you.
> > > > > > 1. modify mp4 parer src code to support the audio codec(modify
> > > > > > atomdefs.h  sampledescriptionatom.h and some other .cpp).
> > > > > > 2. modify mp4parser node code ,add audio codec type  support.
> > > > > > 3. use OpenCORE OMX-core to register my new omx component, add
> > > > > > codec type or registration to omx related file.
> > > > > > from above steps, i have some question:
> > > > > >(1) as you know opencore version 1.x  has a simpler omx component
> > > > > > interface, but in 2.0  the interface is more complex, why? it make 
> > > > > > me
> > > > > > to wrap the interface
> > > > > >  in cpp file.
>
> > > > >  Actually, I don't know this. I am not aware of any change of OMX
> > > > > component interface. Can you please point me to an example?>(2) 
> > > > > and if  i want use  my own OMX-core (i'm going to do so), how
> > > > > > can i do the job as you say? i read the omx intergrate doc, but  i
> > > > > > could not understand it fully.
>
> > > > > Here is a thread where someone was able to do 
> > > > > so. http://groups.google.com/group/android-framework/browse_thread/thread
> > > > > Let us know if you still have questions.>(3) today i try to get 
> > > > > the pv log out to debug ,but failed to do
> > > > > > so, as you know in version 1.x i can modify two head files and one 
> > > > > > cpp
> > > > > > to get the log out, i have try to  make with
> > > > > >  ENABLE_PV_LOGGING=1  and even try to modify some log 
> > > > > > related
> > > > > > head files, but i got nothing. you know it cost a lot of time, and
> > > > > > can't use log message to debug make me upset.
>
> > > > > This should still work. Folks here have also been able to get logs
> > > > > through this method.

Re: encoder support in OpenCORE/cupcake

2009-04-21 Thread Ravi

Repost ...

http://groups.google.com/group/android-framework/browse_thread/thread/ef0d6aaab8390ea3/


On Apr 21, 7:27 am, Andy Quan  wrote:
> I find in ./engines/author/src/single_core/pvaenodefactoryutility.h,
> "PVMFOMXVideoEncNodeFactory::CreateVideoEncNode()" is called to create video
> node. In my understanding, this means OMX is expected to be used in
> PVAuthor. However, I did not find any encoder OMX provided by PV under
> opencore folder...
>
> My question is: is encoding path in opencore of cupcake tested? I think
> theoratically 3rd party OMX should work in that node but I have no idea
> whether that node itself can work correctly.
>
> Many thanks in advance!
>
> --
> Thanks,
> Andy



Re: amr-wb codec in .3gp file format

2009-04-21 Thread Ravi

When was the last time you did your "repo sync"?

When you "push" the content, do you see any error? Are you sure the
sdcard is working for any other application?

Note that on master, the sdcard functionality is broken.
http://code.google.com/p/android/issues/detail?id=2335



On Apr 21, 6:30 am, chuan  wrote:
> I modify MediaScanner.java MediaFile.java too, but no file can be
> scanned out successfully as well as .mp4/3gp/3g2
>
> On Apr 21, 6:23 pm, chuan  wrote:
>
> > Dear Ravi,
> >I met a problem when testing video/audio playbak on opencore 2.0, I
> > can't scan out media files that I push into the sdcard, my step as
> > below:
> >1.  adb push test.mp4 sdcard/  (or other audio/video files)
> >2.  reset emulator and run it using "emulator -sdcard sdcard.img"
> >This happen when I change opencore from 1.x to 2.0. In opencore 1.x
> > it can always work and i can find media file in Music or VideoPlayer
> > app, but now i cannot get them.
> >The difference is In opencore 1.x I use Scanner in Dev Tool to scan
> > files,while in 2.0 I have  to reset  emulator( it seems scan work is
> > done when emulator is set up). I only get a file occasionally.
> >AND, if I want to scan out a new file format(such as .awb or .qcp),
> > what need to be done except modify mediascanner.cpp ? ( only scan out,
> > other work is beyond consideration)
> >Many thanks!
>
> > On Apr 15, 10:40 pm, Ravi  wrote:
>
> > > On Apr 15, 7:11 am, chuan  wrote:> Dear Ravi:
> > > >Thank so much for your quickly reply, as i am try hard to figure it
> > > > out .
> > > > Now i wish to share my plan and wish get help from you.
> > > > 1. modify mp4 parer src code to support the audio codec(modify
> > > > atomdefs.h  sampledescriptionatom.h and some other .cpp).
> > > > 2. modify mp4parser node code ,add audio codec type  support.
> > > > 3. use OpenCORE OMX-core to register my new omx component, add
> > > > codec type or registration to omx related file.
> > > > from above steps, i have some question:
> > > >(1) as you know opencore version 1.x  has a simpler omx component
> > > > interface, but in 2.0  the interface is more complex, why? it make me
> > > > to wrap the interface
> > > >  in cpp file.
>
> > >  Actually, I don't know this. I am not aware of any change of OMX
> > > component interface. Can you please point me to an example?>(2) and 
> > > if  i want use  my own OMX-core (i'm going to do so), how
> > > > can i do the job as you say? i read the omx intergrate doc, but  i
> > > > could not understand it fully.
>
> > > Here is a thread where someone was able to do 
> > > so. http://groups.google.com/group/android-framework/browse_thread/thread
> > > Let us know if you still have questions.>(3) today i try to get the 
> > > pv log out to debug ,but failed to do
> > > > so, as you know in version 1.x i can modify two head files and one cpp
> > > > to get the log out, i have try to  make with
> > > >  ENABLE_PV_LOGGING=1  and even try to modify some log related
> > > > head files, but i got nothing. you know it cost a lot of time, and
> > > > can't use log message to debug make me upset.
>
> > > This should still work. Folks here have also been able to get logs
> > > through this method. Let me see if I can patch up something to get the
> > > logs more easily.
>
> > > > looking forward for your  reply expecial question(3) , thx!
>
> > > > On Apr 15, 4:10 pm, Ravi  wrote:
>
> > > > > I don't know enough about QCELP to comment on how and if to modify the
> > > > > mp4 parser. I can try to find out.
>
> > > > > In general, the steps to add a new OMX codec would be:
> > > > >(i) Add the required format, if not present, in pvmi/pvmf/include/
> > > > > pvmf_format_type.h.
> > > > >(ii) Add the mime type in the capability of PVMFOMXAudioDecNode()
> > > > > or PVMFOMXVideoDecNode() [see the constructor].
> > > > > if you are going to use the OpenCORE OMX-core ...
> > > > >(iii) Register the component in the omx registry - codecs_v2/omx/
> > > > > omx_common/src/pv_omxregistry.cpp.
> > > > > or
> > > > > if you are going to provide your own OMX-core ...
> > > > >(iii) Have your OMX-core register this new component.

Re: Using pthreads in Android

2009-04-21 Thread Ravi

Make an attempt...
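
For what it's worth, if joining is the hard requirement, one option is to create that particular thread with the pthread API that Jeff points to (bionic/libc/include/pthread.h) instead of createThreadEtc(), so you hold a real pthread_t to pass to pthread_join(). A minimal sketch; the WorkerArgs/workerEntry/runWorker names are just made up for illustration:

#include <pthread.h>
#include <cstdio>

// State handed to the worker thread (illustrative only).
struct WorkerArgs {
    int iterations;
};

// Thread entry point; its return value is what pthread_join() hands back.
static void* workerEntry(void* arg) {
    WorkerArgs* args = static_cast<WorkerArgs*>(arg);
    for (int i = 0; i < args->iterations; ++i) {
        // ... do the component's per-iteration work here ...
    }
    return NULL;
}

int runWorker() {
    WorkerArgs args = { 10 };
    pthread_t tid;
    if (pthread_create(&tid, NULL, workerEntry, &args) != 0) {
        return -1;                  // thread creation failed
    }
    void* result = NULL;
    pthread_join(tid, &result);     // blocks until workerEntry returns
    printf("worker finished\n");
    return 0;
}

Whether casting android_thread_id_t back to pthread_t happens to work depends on how Threads.cpp stores the id on your build, so I would check that before relying on the cast.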

On Apr 21, 1:34 am, nightwish  wrote:
> Can i typecast android_thread_id_t to pthread_t?  I am creating thread
> using createThreadEtc. and using android::Mutex and android::Condition
> for synchronization. For pthread_join i need to pass pthread_t as
> parameter.
>
> On Apr 20, 10:00 pm, Jeff Hamilton  wrote:
>
> > Bionic does have support for pthreads, including pthread_join(). See
> > bionic/libc/include/pthread.h.
>
> > -Jeff
>
> > On Mon, Apr 20, 2009 at 9:02 AM, nightwish  wrote:
>
> > > Is it advisable to use pthread APIs in Andriod library layer? I need
> > > to use pthread_join for my component written in C++. But unfortunately
> > > i dint find any APIs which support this is Threads.cpp.



Re: SBC integration license issue in BlueZ/Android

2009-04-19 Thread Ravi

I do see the GPL license. That's bizarre. I was under the impression
that the entire android tree has to be provided with the Apache
license. Maybe someone from Google can confirm.

Just fyi...PacketVideo has provided their version of the SBC encoder
already as part of OpenCORE [external/opencore/codecs_v2/audio/sbc].

-Ravi

On Apr 20, 12:13 am, Andy Quan  wrote:
> Hi,
> As far as I remember, BlueZ is released under GPL license. Is this the same
> in android? If so, does this mean if I want to integrate my customized SBC
> encoder lib into this framework, I have to make my encoder open source?
>
> --
> Thanks,
> Andy



Re: Regarding Audio/Video sink in Opencore framework.

2009-04-15 Thread Ravi

From what I understand, the OP wants to write test cases against the
OpenCORE codebase that is provided as open source, but the attempt is to
write test cases using the media player API. That is not possible.

If one wants to write test cases against the OpenCORE framework, they have
to use the open-source version. We have ample examples that show how
test cases can be written.

If one is using the SDK release, they have to use the existing media
player APIs (provided only in Java).



On Apr 15, 8:12 am, John Write  wrote:
> Hi Ravi,
>
> your we should not use Android API??? rt. So what you mean by exactly Java
> level API...
>
> thx
> John
>
> On Wed, Apr 15, 2009 at 7:42 PM, Ravi  wrote:
>
> > Please read the documentation at opencore/doc/
> > pvplayer_developers_guide.pdf. This provides the high-level
> > description of the PV architecture.
>
> > OpenCORE is a native level code. If you are using the "SDK release",
> > you cannot write test cases for our code. You have to use the "media
> > player" java level APIs.
>
> > On Apr 15, 7:05 am, sush  wrote:
> > > Hi,
> > > Thanks for your quick reply.
> > > Actually I was going through the Open Core PV testcases documentation
> > > that they had given for r1.1, and in those testcases they have
> > > mentioned something about OutsideNodeForVideoSink testCase and the
> > > description is as follows for the Multimedia Engine:
>
> > > It creates an Media output node for the video Data
> > > Sink
> > > Step1. Open the video file
> > > Step 2. Play the file for 7 seconds
> > > Step 3. Stop the playback.
>
> > > What is this Media Output Node..? Any idea on this will be
> > > appreciable.
>
> > > And in some more testcases they say of removing and adding the video
> > > sinks.I am developing testcases in the application level. And no such
> > > APIs are exposed in the androidSDK r1.1 so far in the android.media
> > > package. Any idea on how to go with this would be of great help.
>
> > > Thanks,
> > > Sush
>
> > > On Apr 15, 1:14 pm, Ravi  wrote:
>
> > > > Source --- Where the input data originates. E.g., A file (local
> > > > playback).
> > > > Sink --- Where the output data culminates. E.g., Screen (video) or
> > > > Speaker (audio)
>
> > > > On Apr 14, 11:31 pm, sush  wrote:
>
> > > > > Hi,
>
> > > > > I was trying to develop test scripts for the PV Player TestCases that
> > > > > were published in the source code version r1.1. In some of the
> > > > > testcases they talk about the audio/video sink. I just wanted to know
> > > > > what does these sinks do and is it possible to create a dummy sink
> > and
> > > > > read/play the media file from that sink.
>
> > > > > Please help me in this.
>
> > > > > Regards,
> > > > > Sush



Re: amr-wb codec in .3gp file format

2009-04-15 Thread Ravi



On Apr 15, 7:11 am, chuan  wrote:
> Dear Ravi:
>Thank so much for your quickly reply, as i am try hard to figure it
> out .
> Now i wish to share my plan and wish get help from you.
> 1. modify mp4 parer src code to support the audio codec(modify
> atomdefs.h  sampledescriptionatom.h and some other .cpp).
> 2. modify mp4parser node code ,add audio codec type  support.
> 3. use OpenCORE OMX-core to register my new omx component, add
> codec type or registration to omx related file.
> from above steps, i have some question:
>(1) as you know opencore version 1.x  has a simpler omx component
> interface, but in 2.0  the interface is more complex, why? it make me
> to wrap the interface
>  in cpp file.
 Actually, I don't know this. I am not aware of any change of OMX
component interface. Can you please point me to an example?
>(2) and if  i want use  my own OMX-core (i'm going to do so), how
> can i do the job as you say? i read the omx intergrate doc, but  i
> could not understand it fully.
Here is a thread where someone was able to do so.
http://groups.google.com/group/android-framework/browse_thread/thread/f449fc4dc16003de/.
Let us know if you still have questions.
>(3) today i try to get the pv log out to debug ,but failed to do
> so, as you know in version 1.x i can modify two head files and one cpp
> to get the log out, i have try to  make with
>  ENABLE_PV_LOGGING=1  and even try to modify some log related
> head files, but i got nothing. you know it cost a lot of time, and
> can't use log message to debug make me upset.
This should still work. Folks here have also been able to get logs
through this method. Let me see if I can patch up something to get the
logs more easily.
>
>     looking forward for your  reply expecial question(3) , thx!
>
> On Apr 15, 4:10 pm, Ravi  wrote:
>
> > I don't know enough about QCELP to comment on how and if to modify the
> > mp4 parser. I can try to find out.
>
> > In general, the steps to add a new OMX codec would be:
> >(i) Add the required format, if not present, in pvmi/pvmf/include/
> > pvmf_format_type.h.
> >(ii) Add the mime type in the capability of PVMFOMXAudioDecNode()
> > or PVMFOMXVideoDecNode() [see the constructor].
> > if you are going to use the OpenCORE OMX-core ...
> >(iii) Register the component in the omx registry - codecs_v2/omx/
> > omx_common/src/pv_omxregistry.cpp.
> > or
> > if you are going to provide your own OMX-core ...
> >(iii) Have your OMX-core register this new component.
> >(iv) Have your OMX-core dynamically loadable.
>
> > Not sure about your build failures. Can you post snippets of
> > failures?
>
> > On Apr 14, 11:27 pm, chuan  wrote:
>
> > > Dear Ravi:
> > > question 1:
> > > if i want intergrate other audio codec in opencore 2.0 such as
> > > qcelp in 3gp/3g2 file, how can i modify mp4 parser to let it work?
> > > this audio codec omx components are perpared, new codec types are
> > > already define in pvmi\pvmf\include\pvmf_format_type.h, any where i
> > > can add this codec is added
> > > if you have any sugestion , pls let me knew
> > > question2:
> > > to enable looger  i use cmd below
> > >rm -rf out/target/product/generic/obj/include/libpv
> > >make -j ENABLE_PV_LOGGING=1
>
> > >but i find it cannot compile through, a lot of killed appears, why?
>
> > >   looking farward for your kindly reply.
>
> > > On Feb 27, 10:27 pm, rktb  wrote:
>
> > > > It should be a new format in the existing container. Go through the
> > > > existing mp4 parser and try to understand how it is structured. We can
> > > > help you with specific questions.
>
> > > > Btw, I am assuming that you would be contributing this work back to
> > > > OHA community.
>
> > > > -Ravi
>
> > > > On Feb 27, 1:21 am, Lucien  wrote:
>
> > > > > Hi,
> > > > > Is there any documents,hints,or suggestions for me if I want
> > > > > to implement mp3 codec in mp4 ?
> > > > > How to start this work ?
> > > > > And is the effort hugeous ?
> > > > > In addition, does PV have plan or roadmap to support this
> > > > > functionality
> > > > > Thanks for your response
>
> > > > > > On Feb 24, 9:13 pm, rktb  wrote:
>
> > > > > > Hi,
>
> > > > > > There is no special concern in supporting mp3 in mp4. It has not 
> > > > > > been done yet.

Re: Regarding Audio/Video sink in Opencore framework.

2009-04-15 Thread Ravi



Please read the documentation at opencore/doc/
pvplayer_developers_guide.pdf. This provides the high-level
description of the PV architecture.

OpenCORE is native-level code. If you are using the "SDK release",
you cannot write test cases for our code. You have to use the "media
player" Java-level APIs.

On Apr 15, 7:05 am, sush  wrote:
> Hi,
> Thanks for your quick reply.
> Actually I was going through the Open Core PV testcases documentation
> that they had given for r1.1, and in those testcases they have
> mentioned something about OutsideNodeForVideoSink testCase and the
> description is as follows for the Multimedia Engine:
>
> It creates an Media output node for the video Data
> Sink
> Step1. Open the video file
> Step 2. Play the file for 7 seconds
> Step 3. Stop the playback.
>
> What is this Media Output Node..? Any idea on this will be
> appreciable.
>
> And in some more testcases they say of removing and adding the video
> sinks.I am developing testcases in the application level. And no such
> APIs are exposed in the androidSDK r1.1 so far in the android.media
> package. Any idea on how to go with this would be of great help.
>
> Thanks,
> Sush
>
> On Apr 15, 1:14 pm, Ravi  wrote:
>
> > Source --- Where the input data originates. E.g., A file (local
> > playback).
> > Sink --- Where the output data culminates. E.g., Screen (video) or
> > Speaker (audio)
>
> > On Apr 14, 11:31 pm, sush  wrote:
>
> > > Hi,
>
> > > I was trying to develop test scripts for the PV Player TestCases that
> > > were published in the source code version r1.1. In some of the
> > > testcases they talk about the audio/video sink. I just wanted to know
> > > what does these sinks do and is it possible to create a dummy sink and
> > > read/play the media file from that sink.
>
> > > Please help me in this.
>
> > > Regards,
> > > Sush



Re: Regarding Audio/Video sink in Opencore framework.

2009-04-15 Thread Ravi

Source --- Where the input data originates. E.g., A file (local
playback).
Sink --- Where the output data culminates. E.g., Screen (video) or
Speaker (audio)

On Apr 14, 11:31 pm, sush  wrote:
> Hi,
>
> I was trying to develop test scripts for the PV Player TestCases that
> were published in the source code version r1.1. In some of the
> testcases they talk about the audio/video sink. I just wanted to know
> what does these sinks do and is it possible to create a dummy sink and
> read/play the media file from that sink.
>
> Please help me in this.
>
> Regards,
> Sush



Re: amr-wb codec in .3gp file format

2009-04-15 Thread Ravi

What you are trying above is correct.

On Apr 14, 11:48 pm, chuan  wrote:
> add one question:
> what's the simplest method to see the complete log of opencore?
>
> On Feb 27, 10:27 pm, rktb  wrote:
>
> > It should be a new format in the existing container. Go through the
> > existing mp4 parser and try to understand how it is structured. We can
> > help you with specific questions.
>
> > Btw, I am assuming that you would be contributing this work back to
> > OHA community.
>
> > -Ravi
>
> > On Feb 27, 1:21 am, Lucien  wrote:
>
> > > Hi,
> > > Is there any documents,hints,or suggestions for me if I want
> > > to implement mp3 codec in mp4 ?
> > > How to start this work ?
> > > And is the effort hugeous ?
> > > In addition, does PV have plan or roadmap to support this
> > > functionality
> > > Thanks for your response
>
> > > > On Feb 24, 9:13 pm, rktb  wrote:
>
> > > > Hi,
>
> > > > There is no special concern in supporting mp3 in mp4. It has not been
> > > > done yet. OpenCORE currently supports 3gpp fileformats in it's mp4
> > > > parser.
>
> > > > -Ravi
>
> > > > On Feb 23, 7:10 am, Andy Quan  wrote:
>
> > > > > Ravi,
> > > > > Is there any special concern about not supporting MP3 in MP4 format? 
> > > > > I mean
> > > > > whether there are any legal issues or is this simply a technical 
> > > > > issue? It
> > > > > seems MP3 held by MP4 is not a corner case.
>
> > > > > On Sat, Feb 14, 2009 at 12:15 AM, rktb  wrote:
>
> > > > > > On Feb 13, 4:34 am, Lucien  wrote:
> > > > > > > Dear all :
>
> > > > > > >  I have some problems to consult
>
> > > > > > > First,
> > > > > > > Can music player list a music file with amr-wb codec in 
> > > > > > > 3gp
> > > > > > > file
> > > > > > > format ?
> > > > > > > I have tried to push a test file(only audio with amr-wb 
> > > > > > > codec
> > > > > > > in 3gp) into
> > > > > > > sdcard , but music player list didn't show it
> > > > > > > In addition , when I used browser to connect a web 
> > > > > > > server, I
> > > > > > > can download
> > > > > > > the test file but still can't play it
> > > > > > > Moreover, if the video part of the test file(.3gp) is 
> > > > > > > encoded
> > > > > > > with h.263 and
> > > > > > > audio encoded with amr-wb codec, the media player can 
> > > > > > > display
> > > > > > > without audio
> > > > > > > output.
> > > > > > > What is the problem of this issue ?
>
> > > > > > I tried playing 3gp files with amr-wb encoded data. It works for me.
> > > > > > You need to file a bug with your specific clip so that somebody can
> > > > > > take a look at it.
>
> > > > > > > Second problem,
> > > > > > > Can music player play a music file encoded with mp3 codec 
> > > > > > > in
> > > > > > > mp4 file
> > > > > > > format ?
>
> > > > > > OpenCORE currently doesn't support mp3 codec in a mp4 container.
>
> > > > > > > At last, opencore 2.0 have been released.
> > > > > > > Are those two issues supported?
>
> > > > > > >Thanks!!
>
> > > > > > > Best regards,
> > > > > > > Lucien
>
> > > > > --
> > > > > Thanks,
> > > > > Andy



Re: amr-wb codec in .3gp file format

2009-04-15 Thread Ravi

I don't know enough about QCELP to comment on how and if to modify the
mp4 parser. I can try to find out.

In general, the steps to add a new OMX codec would be:
   (i) Add the required format, if not present, in pvmi/pvmf/include/
pvmf_format_type.h.
   (ii) Add the mime type in the capability of PVMFOMXAudioDecNode()
or PVMFOMXVideoDecNode() [see the constructor].
if you are going to use the OpenCORE OMX-core ...
   (iii) Register the component in the omx registry - codecs_v2/omx/
omx_common/src/pv_omxregistry.cpp.
or
if you are going to provide your own OMX-core ...
   (iii) Have your OMX-core register this new component.
   (iv) Have your OMX-core dynamically loadable.
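
To make the "own OMX-core" variant of step (iii) a bit more concrete, below is a rough sketch of the shape such a registry usually takes behind the standard OMX_Core.h entry points. Only OMX_ComponentNameEnum() and the OMX types/return codes come from the OpenMAX IL headers; the ComponentEntry table and the MyQcelpDecoderInit factory are hypothetical names for illustration, and the real OpenCORE registry in codecs_v2/omx/omx_common/src/pv_omxregistry.cpp uses its own data structures.

#include <OMX_Core.h>
#include <string.h>

// Factory exported by the new component (hypothetical name).
extern "C" OMX_ERRORTYPE MyQcelpDecoderInit(OMX_HANDLETYPE hComponent);

// One row per registered component (illustrative layout).
struct ComponentEntry {
    const char*   name;   // e.g. "OMX.vendor.audio.decoder.qcelp"
    const char*   role;   // e.g. "audio_decoder.qcelp"
    OMX_ERRORTYPE (*init)(OMX_HANDLETYPE hComponent);
};

static const ComponentEntry gRegistry[] = {
    { "OMX.vendor.audio.decoder.qcelp", "audio_decoder.qcelp", MyQcelpDecoderInit },
};
static const OMX_U32 gRegistrySize = sizeof(gRegistry) / sizeof(gRegistry[0]);

// The framework discovers components by enumerating names from the core.
OMX_ERRORTYPE OMX_ComponentNameEnum(OMX_STRING cComponentName,
                                    OMX_U32 nNameLength,
                                    OMX_U32 nIndex)
{
    if (nIndex >= gRegistrySize) {
        return OMX_ErrorNoMore;   // tells the caller the list is exhausted
    }
    strncpy(cComponentName, gRegistry[nIndex].name, nNameLength);
    cComponentName[nNameLength - 1] = '\0';
    return OMX_ErrorNone;
}

OMX_GetHandle() would then look up the same table by name and call that entry's init function; for step (iv) the core just needs to be built so it can be loaded dynamically.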

Not sure about your build failures. Can you post snippets of
failures?


On Apr 14, 11:27 pm, chuan  wrote:
> Dear Ravi:
> question 1:
> if i want intergrate other audio codec in opencore 2.0 such as
> qcelp in 3gp/3g2 file, how can i modify mp4 parser to let it work?
> this audio codec omx components are perpared, new codec types are
> already define in pvmi\pvmf\include\pvmf_format_type.h, any where i
> can add this codec is added
> if you have any sugestion , pls let me knew
> question2:
> to enable looger  i use cmd below
>rm -rf out/target/product/generic/obj/include/libpv
>make -j ENABLE_PV_LOGGING=1
>
>but i find it cannot compile through, a lot of killed appears, why?
>
>   looking farward for your kindly reply.
>
> On Feb 27, 10:27 pm, rktb  wrote:
>
> > It should be a new format in the existing container. Go through the
> > existing mp4 parser and try to understand how it is structured. We can
> > help you with specific questions.
>
> > Btw, I am assuming that you would be contributing this work back to
> > OHA community.
>
> > -Ravi
>
> > On Feb 27, 1:21 am, Lucien  wrote:
>
> > > Hi,
> > > Is there any documents,hints,or suggestions for me if I want
> > > to implement mp3 codec in mp4 ?
> > > How to start this work ?
> > > And is the effort hugeous ?
> > > In addition, does PV have plan or roadmap to support this
> > > functionality
> > > Thanks for your response
>
> > > > On Feb 24, 9:13 pm, rktb  wrote:
>
> > > > Hi,
>
> > > > There is no special concern in supporting mp3 in mp4. It has not been
> > > > done yet. OpenCORE currently supports 3gpp fileformats in it's mp4
> > > > parser.
>
> > > > -Ravi
>
> > > > On Feb 23, 7:10 am, Andy Quan  wrote:
>
> > > > > Ravi,
> > > > > Is there any special concern about not supporting MP3 in MP4 format? 
> > > > > I mean
> > > > > whether there are any legal issues or is this simply a technical 
> > > > > issue? It
> > > > > seems MP3 held by MP4 is not a corner case.
>
> > > > > On Sat, Feb 14, 2009 at 12:15 AM, rktb  wrote:
>
> > > > > > On Feb 13, 4:34 am, Lucien  wrote:
> > > > > > > Dear all :
>
> > > > > > >  I have some problems to consult
>
> > > > > > > First,
> > > > > > > Can music player list a music file with amr-wb codec in 
> > > > > > > 3gp
> > > > > > > file
> > > > > > > format ?
> > > > > > > I have tried to push a test file(only audio with amr-wb 
> > > > > > > codec
> > > > > > > in 3gp) into
> > > > > > > sdcard , but music player list didn't show it
> > > > > > > In addition , when I used browser to connect a web 
> > > > > > > server, I
> > > > > > > can download
> > > > > > > the test file but still can't play it
> > > > > > > Moreover, if the video part of the test file(.3gp) is 
> > > > > > > encoded
> > > > > > > with h.263 and
> > > > > > > audio encoded with amr-wb codec, the media player can 
> > > > > > > display
> > > > > > > without audio
> > > > > > > output.
> > > > > > > What is the problem of this issue ?
>
> > > > > > I tried playing 3gp files with amr-wb encoded data. It works for me.
> > > > > > You need to file a bug with your specific clip so that somebody can
> > > > > > take a look at it.

Re: Khronos Conformance test for PV codecs

2009-04-15 Thread Ravi

The last I checked...yes.

-Ravi

On Apr 15, 12:04 am, sonal gupta  wrote:
> HI Ravi,
> Could you please tell about this ?
> Regards,
> Sonal
>
> On Tue, Apr 14, 2009 at 11:04 AM, sonal  wrote:
>
> > Are PV audio/video OMX Codecs tested for Khronos conformance ?



Re: What are those branches other than "master"

2009-04-15 Thread Ravi

Example: http://android.git.kernel.org/?p=platform/bionic.git;a=heads

It shows master, cupcake, donut, etc. as the heads or branch names.

-Ravi

On Apr 14, 10:40 pm, John Cola  wrote:
> Hi all
> Waht are those branches other than master?
> I only know cupcake, are there anyother else?
>
> BR
> John