The simplest approach is just firing off an intent to the existing
camera app to take a picture. This requires the user to push the
shutter button.
If you want it purely under program control, you could have the
application snap the picture without the user pressing a button. It
just takes a bit
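A minimal sketch of the intent approach, assuming the standard image-capture action string and an arbitrary request code (the thumbnail "data" extra is how the stock camera app returns its result):

```java
import android.app.Activity;
import android.content.Intent;
import android.graphics.Bitmap;
import android.os.Bundle;

public class SnapActivity extends Activity {
    private static final int REQUEST_CAPTURE = 1; // arbitrary request code

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Hand off to the built-in camera app; the user still presses
        // the shutter button there.
        startActivityForResult(
                new Intent("android.media.action.IMAGE_CAPTURE"),
                REQUEST_CAPTURE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode,
            Intent data) {
        if (requestCode == REQUEST_CAPTURE && resultCode == RESULT_OK) {
            // The camera app returns a small thumbnail in the "data" extra.
            Bitmap thumbnail = (Bitmap) data.getExtras().get("data");
        }
    }
}
```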
Right, which means that you need to open the resource file in your
process and pass the file descriptor in setDataSource instead of the
URI.
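A minimal sketch of that, where `R.raw.clip` stands in for your own raw resource id:

```java
import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.media.MediaPlayer;
import java.io.IOException;

// Open the resource in our own process and pass the open file
// descriptor (plus its offset/length within the APK) to the media
// server, instead of a resource URI it cannot resolve.
MediaPlayer playRawResource(Context context) throws IOException {
    AssetFileDescriptor afd =
            context.getResources().openRawResourceFd(R.raw.clip);
    MediaPlayer player = new MediaPlayer();
    player.setDataSource(afd.getFileDescriptor(),
            afd.getStartOffset(), afd.getLength());
    afd.close();  // the player holds its own reference to the descriptor
    player.prepare();
    player.start();
    return player;
}
```

Note that openRawResourceFd() only works for resources stored uncompressed in the APK, which raw resources normally are.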
On Jan 27, 4:16 pm, Marco Nelissen wrote:
> Playback/decoding actually happens in a different process (the media
> server), so that process needs to have p
This message is off-topic, this forum is for application developers.
Try the android-framework list.
On Jan 26, 10:53 pm, bardshen wrote:
> Dear Sirs:
> when i try to build the Android source code download from internet
> use repo. i meet the following problem:
>
> external/opencore//an
If you are really ambitious, you can download the Cupcake source,
unhide all the new API's and build the SDK yourself. However, that is
a topic for a different list.
On Jan 27, 5:58 am, Jean-Baptiste Queru wrote:
> You can't. You'll have to wait for an SDK built from the cupcake code
> base, and
The camera application in the Cupcake branch does it somehow. You
could try looking at the code in packages/apps/Camera.
On Jan 26, 4:38 pm, GiladH wrote:
> Hey,
>
> Is there a way to identify which of the MediaStore images has been
> taken
> on 'this' device, as opposed to pictures exported/dow
tners
You can see the code in development on the Cupcake branch at
android.git.kernel.org.
On Jan 26, 11:31 am, benmccann wrote:
> I'm happy to hear future releases will support the ability to stream
> audio being recorded. Any ETA on this?
>
> On Dec 30 2008, 9:58 am, Dave Sparks wrote:
Would you please post a bug with specifics? Thanks!
On Jan 26, 12:03 pm, Tim Bray wrote:
> The section "Recording Media Resources"
> of http://code.google.com/android/toolbox/apis/media.html seems to be out of
> date and wrong. I got working code
> from http://rehearsalassist.svn.sourceforge.net
It is not possible to access call audio in the G1. This is a function
of the firmware in the radio and DSP and is currently not supported.
It is possible that future devices may enable this functionality, but
at the moment it is not part of a planned release.
On Jan 24, 2:32 am, javame_android
ne has not yet had a chance to close it.
>
> On Fri, Jan 23, 2009 at 9:08 AM, Dave Sparks wrote:
>
>
>
>
>
> > The camera service has no concept of foreground activity. It simply
> > gives the camera to the app that requests it. If another app currently
> > owns the
ect a fix
for this in a future OpenCore release. In the meantime, you need to be
a bit more conservative about the encoding of your content.
On Jan 23, 8:11 am, Allan Beaufour wrote:
> On 20 Jan., 18:22, Dave Sparks wrote:
>
> > We are working on improving media player error codes for t
If you are looking for the camera application source code, you'll find
it on android.git.kernel.org in the packages/apps project.
On Jan 23, 5:47 am, ANDREA P wrote:
> There is a Camera application in the Android Emulator...
>
> The application code is Camera.java but I don't know where is it
The camera service has no concept of foreground activity. It simply
gives the camera to the app that requests it. If another app currently
owns the camera, it is notified when the camera is stolen.
I don't know all the rationale for that design decision. It's probably
not the way I would have des
We do not support native code on Android at this time, but we have
plans to publish a native SDK soon.
On Jan 22, 2:03 am, MRK wrote:
> I am creating an Android application which uses the JMF (SIP, RTP,
> JAIN). So i downloaded the JMF source code for some adhoc change to my
> application.
>
> T
I suspect that your problem is in some details that you haven't given
us yet.
How many media players are you creating at the same time?
On Jan 20, 10:47 pm, ena wrote:
> On Jan 21, 8:23 am, Dave Sparks wrote:
> > What is the format of the data in the WAVE file?
>
> i pl
calling auto-focus first. You don't want to move
the camera until after the shutter callback.
On Jan 21, 1:59 am, "mobilek...@googlemail.com"
wrote:
> Could you list the proper sequence as I'm having hard time working it
> out! Thanks
>
> On Jan 21, 3:21 am, Dave Spar
You need to call startPreview() before takePicture. You also need to
supply a PictureCallback function to receive the encoded JPEG. By
passing null, you are telling the camera service you don't want the
final JPEG image.
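Roughly the sequence being described, assuming you already have a SurfaceHolder from a SurfaceView displaying the preview:

```java
import android.hardware.Camera;
import android.view.SurfaceHolder;
import java.io.IOException;

void snap(SurfaceHolder holder) throws IOException {
    Camera camera = Camera.open();
    camera.setPreviewDisplay(holder);
    camera.startPreview();               // required before takePicture
    camera.takePicture(null /* shutter */, null /* raw */,
            new Camera.PictureCallback() {
                public void onPictureTaken(byte[] jpeg, Camera c) {
                    // jpeg is the encoded image; passing null for this
                    // callback would have discarded it.
                }
            });
}
```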
On Jan 21, 2:50 am, ANDREA P wrote:
> I want to use the camera to make a ph
Right now, the answer is no. Most cameras require that you go to
preview mode before you can take a picture so that the image processor
can grab some frames for auto-focus, white balance, etc.
I'll see if we can get a change into Cupcake that allows you to start
preview without a surface. That sh
o wait for
the second preview callback.
On Jan 20, 9:44 am, Pascal Merle wrote:
> On the G1!
>
> On 20 Jan., 18:22, Dave Sparks wrote:
>
> > Is this on the emulator or on G1?
>
> > On Jan 18, 12:40 pm, Pascal Merle wrote:
>
> > > I am standing a bit in the
What is the format of the data in the WAVE file?
OpenCore only supports 8- and 16-bit linear PCM.
On Jan 20, 11:59 am, ena wrote:
> plz help me out. Actually i want to play many files one by one in
> media player. im using that code
>
> MediaPlayer melodyPlayer=MediaPlayer.create(context, resI
; I registered the callback in CameraActivity.surfaceCreated() method.
>
> Could you advice on how to get that working? Thank you!
>
> On Jan 20, 5:09 pm, Dave Sparks wrote:
>
> > Camera.autoFocus(cb);
>
> > where cb is a callback function you supply that tells you focus is
> > succe
l samples.
>
> it seems that the only way to get this happening at the moment would be to
> hack the device and use the ALSA sound driver.
> What do you think ?
>
> thanks
> Matt
>
> On Sat, Jan 17, 2009 at 6:44 PM, Dave Sparks wrote:
>
>
>
>
>
> > OK,
There is no support for this use case right now.
On Jan 19, 5:23 am, RunX wrote:
> Thanks for the fast reply.
>
> The file is encrypted and I don't want to save the unencrypted file
> and open that.
> How could I do that?
>
> Thanks,
>RunX
>
> On Jan 14, 8:35 pm, rktb wrote:
>
> > If you ha
I think the problem is your path, you need a leading slash:
private String Path_Name = "/sdcard/audio4";
On Jan 17, 11:25 pm, "haze...@gmail.com" wrote:
> hey hi,
> i'm doing a similar program for my school too, but i cant make it to
> work. do you think you can help me? i think i have the cod
We are working on improving media player error codes for the next
major SDK release.
On Jan 19, 3:58 am, Allan Beaufour wrote:
> Hey
>
> Do anybody know if there is some documentation of the MediaPlayer
> error codes somewhere. The docs only has two of them, and I'm getting
> at least four other
Is this on the emulator or on G1?
On Jan 18, 12:40 pm, Pascal Merle wrote:
> I am standing a bit in the dark with the android.hardware.Camera
> class.
>
> What I tried to do is taking a picture without starting preview first:
> Camera mCamera = Camera.open();
> mCamera.ta
This API will be available in a future release.
On Jan 16, 5:22 pm, jjbunn wrote:
> Snap - me too! I'm interested in the answer to this as well. And, if
> access to
> the raw audio buffers is available, a knowledge of what format the
> audio data in would be essential.
>
> I'd prefer not to have
point to where this code is located (ie: what package)?
>
> -peter
>
> have you written up this FAQ?
> On Jan 7, 6:50 pm, Dave Sparks wrote:
>
> > There is no plan to support javax.sound. I guess I need to write up a
> > media FAQ because this question gets asked repeatedly.
>
At this time, there is no mechanism to get at the raw audio after it
is decoded.
On Jan 19, 11:03 am, Valeria wrote:
> Hi everyone,
>
> I'm developing a multimedia player and I want to create an equalizer
> in it , but I can't find any information about how to do it. Could
> anyone help me? I th
upcake solves our needs...
> > > I try this
> > > link,http://review.source.android.com/5290/diff/201/z474cb3132ea0f2114f9e5...,
> > > but when i import the projects into my workspace, many compile errors
> > > appear. I think that there is a lot of
No, this is not supported.
On Jan 20, 3:57 am, jalandar wrote:
> is it possible to take photo with emulator's camera?, if the pc(on
> emulator is there) having web cam
> thank you
--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Go
Camera.autoFocus(cb);
where cb is a callback function you supply that tells you focus is
successful or not.
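Chained together, that might look like the sketch below; `camera` is an open Camera, and `jpegCallback` stands in for the PictureCallback you pass to takePicture():

```java
// Focus first; only capture once the callback reports success, and
// don't move the camera until the shutter callback fires.
camera.autoFocus(new Camera.AutoFocusCallback() {
    public void onAutoFocus(boolean success, Camera c) {
        if (success) {
            c.takePicture(null /* shutter */, null /* raw */, jpegCallback);
        }
    }
});
```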
On Jan 20, 5:27 am, "mobilek...@googlemail.com"
wrote:
> Hi,
>
> My app is struggling to take focused shots. Is there a built in
> facility that sets an auto-focus property on the camera,
I was just guessing that maybe the OP's use case wasn't actually
recording small files, but rather he was running into a limitation in
the framework and searching for a workaround.
On Jan 16, 3:35 pm, Dan Bornstein wrote:
> On Thu, Jan 15, 2009 at 5:16 PM, Dave Sparks wrote:
>
d level meter. For privacy reasons, we don't want
> the audio lying around on the disk.
>
> We could do it on the fly without recording to disk, however I don't think
> that is possible with the sdk ... is it ?
>
> Matt
>
> On Fri, Jan 16, 2009 at 12:16 PM, Dave Sp
OpenCore is the media playback and authoring engine used to render
most (but not all) of the media content for Android. As an application
developer, you don't access it directly, you access it through the
MediaPlayer interface.
On Jan 16, 11:18 pm, Tez wrote:
> Hi,
>
> Can anyone tell me what is
We have never tested that scenario on the G1 or the emulator and I
would be surprised if it worked. The hardware video decoder can only
support one decode at a time, which means the second stream would fall
back to a software codec. I'm not saying it can't work, but if it
doesn't, we probably won'
The takePicture() function captures an image. The callback occurs when
the picture has been captured and encoded. You aren't seeing the
callback in your app because you haven't started preview mode. It
cannot be used to capture an image from another application.
Even if this did work, it doesn't nece
1. Yes, just call takePicture() from your onClickListener.
2. The preview will fill the surface you give it. Just make sure the
aspect ratio matches the preview size. On the G1, this is currently hard-
coded to 480x320, so you want a 1.5 aspect ratio for your preview
surface.
On Jan 15, 1:05 am,
getParameters returns all the settings that are supported, and
possible values for some.
Be aware that many of these parameters are ignored on the G1 with the
current firmware.
On Jan 14, 10:33 pm, Wanted unique nickname
wrote:
> Where do these Camera Parameters come from? I see in the class i
I am pretty sure that won't work. Why do you want to record a bunch of
small audio files without dropping samples?
On Jan 14, 7:52 pm, flatmax wrote:
> Hi there,
>
> Has anyone managed to record audio to small files without dropping
> samples between ?
>
> perhaps it is possible to use two recor
Video recording is not supported in SDK 1.0.
On Jan 14, 1:55 pm, ANDREA P wrote:
> I want to make a program that recording a video in Android
>
> There is an example
> here http://code.google.com/intl/it-IT/android/toolbox/apis/media.html
>
> The class used is MediaRecorder.
>
> but ther
The Cupcake SDK will include a way of specifying the image quality.
On Jan 12, 8:52 am, Dave Sparks wrote:
> I'll have to look into it, but there should be an extra you can put in
> in the intent to specify image quality. If not, then we should add it.
>
> On Jan 10, 1:38
If it won't play from the SD card, it isn't going to stream either. I
read the thread you referenced in your original message and it leaves
out a lot of details. Information below is for H.264 AVC codec on the
G1:
Performance is rated for AVC baseline profile Level 1.3. A
conservative bit rate fo
I'd have to look at the code, but it doesn't surprise me. The phone
app/lock screen is a tricky bit of code and I don't think it was
intended for the ringtone to play while the screen is off.
Is there a reason you chose the Ringtone class to play your sound?
There are other options e.g. Notificat
Off the top of my head, I think you need to call createThreadEtc with
the flag to indicate that your thread will call into Java.
On Jan 9, 2:00 pm, "redlight9...@gmail.com"
wrote:
> i am trying to make callbacks to my android application from a native
> C thread using JNI. however when i call FindCla
ce.
> I was guessing that I only had G1 specs. I can't seem to find
> anything definitive about the emulator.
> Unfortunately don't have an actual phone right now. I'm creating a
> demo for my boss, and will hopefully get one out of it if I can
> demonstrate it has so
a low image resolution. I want use an other
> method. Use the camera default application and listen the new picture
> added into the image provider. For that I looking for a solution to
> run camera application from my activity.
> Somebody know I do it ?
>
> On 9 Gen, 19:39, Dav
You should be able to write a file in your app's private data
directory.
On Jan 10, 2:19 pm, "hmmm" wrote:
> Hi,
>
> I can see that MediaRecorder class successfully performs audio recording when
> I specify an output file on the SD card, such as "/sdcard/newaudio.3gpp"
>
> But when I specify a
You left out a key detail - what is the frame size?
The specs you quoted are for a G1 device - the emulator is probably
not quite that good. In addition to running soft codecs in ARM
emulation, it's also doing color space conversion, scaling, and
rotation in ARM emulation.
On Jan 9, 4:22 pm, Bra
There is no platform support for this yet. You should take this up in
android-platform.
On Jan 8, 11:58 am, jas_h wrote:
> Hi,
>
> Is there a FM receiver or transmitter application available on the
> Android platform?
> What is preferred - HCI or I2C?
> What about RDS?
>
> Thanks.
e to know how to integrate components like OpenMAX
> compliant codecs, parsers, protocols.. Is there any document or link
> that can guide me?
>
> Thanks and regards,
> -Vishwa
>
> On Jan 9, 8:56 am, Dave Sparks wrote:
>
> > MediaPlayer is a high level abstraction f
There is an interaction between the screen orientation and the camera
that causes problems for portrait mode. There will be a platform fix
in a future release, however it's possible that some devices will not
be able to support portrait mode.
On Jan 9, 5:37 am, jarkman wrote:
> For what little h
You want to fire an "android.media.action.IMAGE_CAPTURE" intent.
On Jan 9, 9:55 am, fala70 wrote:
> Hi guys,
>
> Somebody know how can I call the camera capture application from my
> application ? I tried to use inject key event without success using
> KEY CAMERA BUTTON. If the user push key
raction between AudioTrack
> and MediaPlayer.
>
> Indeed there is no word about (low-level) audio mixing or MediaPlayer
> in
>
> http://android.git.kernel.org/?p=platform/frameworks/base.git;a=blob_...
>
> So, will AudioTrack and MediaPlayer "play" along nicel
MediaPlayer is a high level abstraction for OpenCore. There are no
immediate plans to expose any of OpenCore's lower level API's to Java.
On Jan 8, 1:43 am, vishy s wrote:
> Hi folks,
> I am trying find the relation between android media api's available
> (http://code.google.com/android/referen
There is a video player widget called VideoView for full screen video
with optional transport controls. Alternatively you can write your own
code around the MediaPlayer object.
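A minimal sketch of the VideoView route, inside an Activity; the layout id and the file path are placeholders:

```java
import android.net.Uri;
import android.widget.MediaController;
import android.widget.VideoView;

// Full-screen playback with the stock transport controls attached.
VideoView video = (VideoView) findViewById(R.id.video);
video.setMediaController(new MediaController(this));
video.setVideoURI(Uri.parse("file:///sdcard/cutscene.mp4"));
video.start();
```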
On Jan 7, 2:14 pm, Delmarc wrote:
> I am working on a game where cut-scenes happen... but I have yet to
> read anything
You can download the Android source code from source.android.com and
build your JNI libraries against the gcc toolchain for testing.
On Jan 5, 5:19 pm, blues wrote:
> I have read all the post about JNI. And I know JNI is not offcially
> supported and google is working on the native SDK and I hav
You can convert the raw YUV to RGB and draw it on the surface
yourself. There is no API to send encoded frames directly to an
decoder and have them displayed and there are no plans to support
this.
On Jan 6, 4:34 am, iblues wrote:
> Hi all,
>
> In my application development, there is a requireme
There is no API for this.
On Jan 6, 7:30 am, Skywalker wrote:
> I need to play incoming audio data (raw) on phone speaker (not through
> back dynamic).
> I have not found any API for this. :(
> Help, please...
The G1 camera driver currently ignores the preview size and forces it
to 320 x 240.
On Jan 6, 11:19 am, Omar wrote:
> You can do:
>
> Camera.Parameters p = c.getParameters();
> p.setPreviewSize(width, height);
> c.setParameters(p);
>
> before you d
d MediaPlayer API changes well may be minimal, but the
> resulting functionality mentioned here appears quite significant for
> many of us - although I cannot judge if it also addresses Dan
> McGuirk's needs. Or am I confused about what is coming up in the short
> term?
>
>
I'll be the first to admit that our error reporting is bad right now.
Most likely is that it's unsupported file format or the file itself is
corrupt.
On Jan 7, 2:59 am, manoj wrote:
> Hello friends,
>
> I am trying to play some media files which are located in SDCard.
>
> While playing I got th
I believe this was deliberately left out of the code for 1.0. I'm not
aware of any plans to add it in Cupcake. I suggest you file a feature
request.
On Jan 7, 7:09 am, "Blake B." wrote:
> I'll ping the group one last time. Can anyone confirm that this is
> not possible?
>
> I looked at the late
There is no plan to support javax.sound. I guess I need to write up a
media FAQ because this question gets asked repeatedly.
Cupcake has support for streaming PCM audio in and out of Java. It
also supports static buffers, i.e. load a buffer with sound data and
trigger (one-shot) or loop. Both stat
he phone
> (interface) ? It's my interpretation that if the phone app can handle
> incoming & outgoing calls, then it should be possible to extend it to
> include IVR-like features.
>
> Is there a more-appropriate forum to discuss this 'issue' /
> require
re there plans to update the API to allow more
> flexibility? I wouldn't really want to put a lot of effort into
> developing and maintaining this kind of scheme just to throw it away
> in a few months if the API is improved.
>
> On Jan 2, 10:32 am, Dave Sparks wrote:
>
>
apps
processor.
On Jan 2, 4:06 pm, "mashpl...@gmail.com" wrote:
> This is a mistake. There are many reasons why exposing in-call audio
> to the apps process is a good idea. Please reconsider your position
> on this.
>
> Kind Regards,
>
> Vince
>
> On Jan 3, 1:27
ample code for trying
> this proxy server workaround for playing audio?
>
> Thanks
>
> On Jan 2, 6:32 pm, Dave Sparks wrote:
>
> > I haven't looked at imeem, but one way to get around the issue is
> > using an HTTP proxy on the device. The proxy server could be buff
I haven't looked at imeem, but one way to get around the issue is
using an HTTP proxy on the device. The proxy server could be buffering
up the next stream while the current stream is playing.
On Dec 30 2008, 11:37 pm, Dan McGuirk wrote:
> Hi,
>
> I'm wondering if anyone knows how the imeem appl
There are no plans for exposing in-call audio to the apps processor.
In-call audio is controlled by the radio and typically not accessible
to the apps processor.
On Dec 26 2008, 10:00 pm, StevenS wrote:
> If I'm reading the API documentation correctly, neither the
> MediaRecorder.AudioSource nor
You are running a virtual Linux system on your workstation. It only
has access to the file systems that are mounted, which include the
fixed images required to boot and run the device and an optional
virtual SD card image.
If you are really ambitious, you could modify the emulator code and
sy
oups.com
> [mailto:android-develop...@googlegroups.com] On Behalf Of Dave Sparks
> Sent: Tuesday, December 30, 2008 5:32 PM
> To: Android Developers
> Subject: [android-developers] Re: About speech recognizer
>
> There is no support for speech recognition in the current SDK.
>
>
There is no support for speech recognition in the current SDK.
On Dec 29, 10:26 pm, michael wrote:
> hi all
>Does this ability is already provided now?
It's probably not really streaming audio. Some people are working
around the issue by "tailing" the file as it is being written.
On Dec 30, 5:03 am, FranckLefevre wrote:
> The application "Phone Recorder" available in Market softwares already
> does this pretty well.
> I don't know if sources ar
This forum is for application development. Try asking your questions
in android-framework.
On Dec 25, 3:28 am, "m.developer.software"
wrote:
> Hi,
>
> What does "DecodeFireWallPackets( )" do in pvmf_jitter_buffer_node.cpp
> and why is this requried? What is the concept behind using this?
>
> Ple
Audio streaming will be supported in a future release.
On Dec 24, 4:27 pm, "vitalii.mi...@gmail.com"
wrote:
> Is there any way to record audio stream and send streaming audio to
> network ?? Instead of recording to file.
There is no support for javax.sound in Android and there are no plans
to support it. We will have support for streaming audio in a future
release.
On Dec 29, 1:11 am, Lei wrote:
> I'm stuck on this.
> Help me please.
>
> On Dec 26, 2:23 pm, Lei wrote:
>
> > Hi, all
> > I want to capture the aud
It works for me - I have been using it extensively in the last few
weeks.
On Dec 23, 10:33 pm, "develop code" wrote:
> Hi,
>
> i tried the above modifications, i am not getting the logs from libraries
> (pv player). PV logger enabling method has changed? Any other methods to
> enable. I am tryin
I assume you mean an EQ insert in the final audio output. If so, the
answer is no, this is not part of the Cupcake release.
Cupcake provides support for streaming microphone input to a Java app
and streaming audio from a Java app into the audio mixer.
On Dec 24, 3:25 am, Aasha wrote:
> Would th
On Tue, Dec 23, 2008 at 4:37 PM, Dave Sparks wrote:
>
>
>
> > The MediaPlayer doesn't support streaming at all. You can get away
> > with a "pseudo-streamed" MP3 because the MP3 format was intended to be
> > broadcast and the MP3 parser is a bit more forg
The MediaPlayer doesn't support streaming at all. You can get away
with a "pseudo-streamed" MP3 because the MP3 format was intended to be
broadcast and the MP3 parser is a bit more forgiving about it than the
other parsers.
On Dec 23, 4:21 am, "Aldo Neto" wrote:
> Hi,
> I developed a quite sim
1. The Music Player stops playing when a call comes in and resumes
after the call is complete. With Cupcake, the music player will slowly
ramp up the volume after the call is completed.
2. The streaming audio interface is more flexible than InputStream.
You can build an InputStream interface on i
See this thread for using media player to play from APK resource
files:
http://groups.google.com/group/android-developers/browse_thread/thread/6668898856f8f090
On Dec 22, 1:20 pm, Toothy Bunny wrote:
> Hi All,
> After searching developer group, I found out the problem might be
> related to the
There is no "approved" way to distribute native code. It's not an
issue of feasibility but an issue of compatibility. If we publish an
API, we have to continue to support it for a long time. The native
API's are still in flux, and we don't want to publish them until we
are confident they are going
No, there is no way to play a segment of a media file.
On Dec 19, 9:52 am, Kenneth Loafman wrote:
> Is there a way to set the length of the segment to play like
> setDataSource() has? I'm not seeing it.
>
> ...Thanks,
> ...Ken
>
> On Fri, 19 Dec 2008 09:03:30 -0800 (PS
As a workaround, you can use seekTo followed by start to start
playback partly into a file, and then call getCurrentPosition
repeatedly until you get to the point where you want to stop and then
call stop or pause. It's not going to be very precise.
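The workaround could be sketched like this; the polling interval and segment bounds are arbitrary, and the stop point will only be approximate:

```java
import android.media.MediaPlayer;

// Approximate segment playback: seek, start, poll the position from a
// background thread, then pause once past the end point.
void playSegment(final MediaPlayer player, int startMs, final int endMs) {
    player.seekTo(startMs);
    player.start();
    new Thread(new Runnable() {
        public void run() {
            while (player.isPlaying()
                    && player.getCurrentPosition() < endMs) {
                try {
                    Thread.sleep(100);   // polling granularity ~100 ms
                } catch (InterruptedException e) {
                    return;
                }
            }
            player.pause();
        }
    }).start();
}
```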
On Dec 19, 12:27 pm, Dave Sparks wrote:
setDataSource() with offset is for playing an embedded media file. In
other words, there must be a valid WAVE header at the specified
offset. An example use case is a resource file that contains many WAVE
files with a table of contents at the beginning.
You can call seekTo() to start playback at
I can only guess that the RTSP client in OpenCore didn't like
something it found in the DESCRIBE response. I'm not sure that it can
handle broadcast mode. Maybe someone from PV will be able to provide
some insight.
Did you get an error message back from the MediaPlayer through
onErrorListener?
On D
The media scanner automatically extracts metadata from any file that
it recognizes on the SD card. Can you be more explicit about your use
case?
On Oct 24, 6:45 am, CiprianU wrote:
> Hy guys,
>
> Can you tell me how can I extract the ID3 tags from an mp3 file, using
> Android of course.
>
> Than
Video telephony was not a priority for the markets that we launched in
this year. OpenCore will have a H.324M stack in 2009.
However, VT is a complex function that requires close cooperation
between RIL and media stack. Even though Android will have an open
source VT stack, there will still be a
The media server does not have access to your application's data
directory for security reasons. If you want the media server to play a
file in your data directory, you need to open the file in your app and
use the setDataSource(fd) method to pass open file descriptor to media
player.
Alternative
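A minimal sketch of the file-descriptor route; the file name is a placeholder for something previously written with openFileOutput():

```java
import android.content.Context;
import android.media.MediaPlayer;
import java.io.FileInputStream;
import java.io.IOException;

// The media server can't open files in our private data directory, so
// open the file in our own process and hand over the descriptor.
MediaPlayer playPrivateFile(Context context) throws IOException {
    FileInputStream fis = context.openFileInput("song.mp3");
    MediaPlayer player = new MediaPlayer();
    player.setDataSource(fis.getFD());
    player.prepare();
    player.start();
    fis.close();  // the player keeps its own reference to the descriptor
    return player;
}
```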
It sounds like your application is not releasing its media player
resources. When you're done playing a sound, you need to call the
MediaPlayer.release() method. If you're playing a lot of sounds
rapidly, the garbage collector won't be able to keep up.
Arguably, the runtime shouldn't reboot. I th
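One way to release promptly, assuming a MediaPlayer instance named `player` that plays to completion:

```java
// Free the native player as soon as its sound finishes, instead of
// waiting for the garbage collector to catch up.
player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
    public void onCompletion(MediaPlayer mp) {
        mp.release();
    }
});
```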
The radio firmware generates the tones for the far end. The local
tones are generated algorithmically on the app processor. See
ToneGenerator.
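A minimal ToneGenerator sketch; the stream type and volume (out of ToneGenerator.MAX_VOLUME, which is 100) are arbitrary choices:

```java
import android.media.AudioManager;
import android.media.ToneGenerator;

// Generate a local DTMF '1' tone algorithmically on the app processor.
ToneGenerator tones = new ToneGenerator(AudioManager.STREAM_MUSIC, 80);
tones.startTone(ToneGenerator.TONE_DTMF_1);
// ... later, stop the tone and free the native resources:
tones.stopTone();
tones.release();
```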
On Dec 15, 10:55 pm, Mihai wrote:
> Hi,
>
> I would also be interested in something like this - so I would like
> to ask if the G1 has an oscillator an
When you are in a call, press the JKL tab at the bottom of the screen
to bring up the DTMF dial pad. This will generate DTMF tones over the
radio as well as generate tones for the user to hear locally (unless
the user has disabled the local tones in the Sound/Display settings).
On Dec 15, 8:49 pm
If you're asking how to use the G1, you should post your question in
android-discuss. This forum is about programming for Android, the
software platform that runs on the G1.
Having said that, I think you're looking for the touch tone dialer
which is the little JKL tab at the bottom of the screen
http://code.google.com/android/reference/android/content/Intent.html#ACTION_MEDIA_SCANNER_SCAN_FILE
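From an Activity or Service, that intent could be used like this; the path is just an example:

```java
import android.content.Intent;
import android.net.Uri;
import java.io.File;

// Ask the media scanner to index one file so it shows up in MediaStore.
File image = new File("/sdcard/DCIM/mypic.jpg");
sendBroadcast(new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE,
        Uri.fromFile(image)));
```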
On Dec 14, 2:56 pm, jphdsn wrote:
> hi,
>
> how put images in
>
> MediaStore.Images.Media.INTERNAL_CONTENT_URI
>
> thanks
Some of the ringtones contain special metadata to make them loop.
These sounds will play forever and aren't appropriate for
notifications. Unless you want to really annoy your users. :)
The notifications directory has sounds that don't loop.
On Dec 13, 12:02 am, elDoudou wrote:
> Thank you for the
We are aware of the issues in the Market and they will be addressed
soon.
On Dec 12, 7:42 am, Jeff wrote:
> Is there a way that a developer can speak to someone at google?
>
> Is there a way that a developer can make a request to email? e.g.,
> removing rating that contain abusive language form
The length of the notification sound determines how long it plays. You can
select persistent if you want the sound to play until the user
dismisses it.
You can cancel a notification, so you could conceivably send your self
a delayed message to cancel a notification after a period of time.
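A sketch of the delayed-cancel idea, from inside an Activity; the id and delay are placeholders, and the id must match the one passed to notify():

```java
import android.app.NotificationManager;
import android.content.Context;
import android.os.Handler;

// Post a delayed message that cancels our own notification after 5 s.
final NotificationManager nm =
        (NotificationManager) getSystemService(Context.NOTIFICATION_SERVICE);
final int notifyId = 42;            // placeholder notification id
new Handler().postDelayed(new Runnable() {
    public void run() {
        nm.cancel(notifyId);
    }
}, 5000);
```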
On Dec 12, 2:
I'm beginning to grok the fullness... :)
You could use the ALARM stream, though the user might have silenced
alarms and then nothing will be heard.
We are adding the ability to send events to the music player to tell
it to play, pause, skip, etc. to support AVRCP. This will come in a
release in
Do you really mean YouTube player, or are you referring to the media
player?
In any case, the emulator is using the ARM-optimized software codecs
running in QEMU ARM emulation on your workstation. The performance is
not going to be spectacular.
On the G1, the AVC codec runs on the DSP, so the pe