[android-developers] Re: putInt force closes
JP, You must use the permission WRITE_SETTINGS, not WRITE_SETTING: <uses-permission android:name="android.permission.WRITE_SETTINGS"/> - David On Sunday, July 25, 2010 11:11:14 PM UTC-7, scadaguru wrote: I am trying to write the system setting using: Settings.System.putInt(getApplicationContext().getContentResolver(), Settings.System.WIFI_SLEEP_POLICY, Settings.System.WIFI_SLEEP_POLICY_NEVER_WHILE_PLUGGED); But it force closes my application. Any idea? My app has the following permissions:
<uses-permission android:name="android.permission.WRITE_SETTING"/>
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED"/>
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE"/>
<uses-permission android:name="android.permission.UPDATE_DEVICE_STATS"/>
<uses-permission android:name="android.permission.CHANGE_WIFI_STATE"/>
<uses-permission android:name="android.permission.WAKE_LOCK"/>
Thanks, JP -- You received this message because you are subscribed to the Google Groups Android Developers group. To post to this group, send email to android-developers@googlegroups.com To unsubscribe from this group, send email to android-developers+unsubscr...@googlegroups.com For more options, visit this group at http://groups.google.com/group/android-developers?hl=en
[android-developers] Re: SDK 1.6, ADT 0.9.3, Eclipse 3.4.2: Problem with Android Editors
I have the same problem. Reversion to Eclipse XML Editors and Tools v3.0.5 or v3.0.4 did not help or change the symptoms. My work-around is to use a text editor. Any solution? - David
[android-developers] Re: What's slow on Android?
If you can have just 3 things be much faster, what will they be? I need to make a custom camera filter display based on the camera preview run faster than I currently can. We must get the display rate up to 5-10+ frames per second, though the G1 can barely display 1 fps. To do this, here are three requests: 1. Faster delivery of individual preview frames to the preview callback onPreviewFrame. 2. Support for another data format that I can decode more quickly than the current YCbCr_422_SP data, and support for a configurable preview frame size. 3. Faster looping through every pixel in the image, to process per-pixel calculations more quickly. P.S. The iPhone suffers the same issues, but many J2ME devices do not. As far as I am concerned, this is strictly an issue of camera API support in the JVM. - Regards, David Manpearl On Jan 12, 2:21 pm, Tomei Ningen tomei.nin...@yahoo.com wrote: Hello Android developers, We are building an Android-based device, and would like to know what we should try to improve in terms of performance. If you can have just 3 things be much faster, what will they be? Please be specific (instead of "graphics is too slow", something like "drawing red polka dots on a translucent canvas is slow") and say why they are important (a real-world use case would be good). Thanks!
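Until request 3 (a faster per-pixel loop) is addressed in the platform, the usual workaround is to subsample: visit every second pixel in each direction and quarter the work, at the cost of resolution. A minimal plain-Java sketch of the loop shape (the class and method names are mine, for illustration only):

```java
public class Subsample {
    // Visit every `step`-th pixel in x and y; a filter that tolerates
    // quarter resolution (step = 2) does ~4x less per-frame work than
    // a full per-pixel pass. Returns how many pixels were visited.
    public static int countVisited(int width, int height, int step) {
        int visited = 0;
        for (int y = 0; y < height; y += step)
            for (int x = 0; x < width; x += step)
                visited++; // per-pixel filter work would go here
        return visited;
    }
}
```

With the G1's 480x320 preview, step = 2 cuts the inner-loop iterations from 153,600 to 38,400 per frame.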
[android-developers] Re: processing camera preview before display
Some notes on this thread and some code... 1. You need the Surface and it must be as large as the preview image, though it doesn't have to be visible. Don't set SURFACE_TYPE_PUSH_BUFFERS. 2. Don't try to process the preview in the preview callback. There isn't enough time and you will crash the system; the camera won't work again until after a reboot. 3. In a separate Thread, decode the preview data like this: http://groups.google.com/group/android-developers/msg/d3b29d3ddc8abf9b - Hope this helps, David On Dec 16, 11:42 am, Loechti loec...@gmail.com wrote: Hi, could you post your code? I tried it as described above but it crashes on the phone every time :/ Thanks :) On 16 Nov., 20:39, Sean seaned...@yahoo.com wrote: On Nov 14, 8:49 am, Dave Sparks davidlspa...@gmail.com wrote: You might be able to get away with not passing a surface into the camera object. I don't think we've ever tested that. If it doesn't work, let me know and we'll file a bug to get it fixed. It appears that not passing a surface into the camera object does cause a crash. In that case, you would take the preview frame, do your filter operation on it, and draw it directly to your own view. Expect the preview to be very laggy, though. I was finally able to give the camera a surface that is 0x0 pixels, and then use the preview callback image. The lag isn't a serious problem for my application, although it is unfortunate for those like Blindfold doing augmented reality apps. Dave, I just found a post you made on Issue 1271 (http://code.google.com/p/android/issues/detail?id=1271). Very helpful for when I try to crack open the actual data - thanks!: The actual format is YCbCr420 semi-planar. The Y-plane is followed by an interleaved CbCr plane. This is a native format used by the G1 camera, hardware video codecs, and display processor.
It enables hardware color conversion in the display processor, which is a significant savings over software color conversion.
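The advice in point 2 above (copy the frame out of the callback and decode it on another thread) can be sketched in plain Java, independent of the Android camera classes. The class and method names here are mine, not part of any API; it is a minimal hand-off, assuming you are willing to drop frames while the worker is busy:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Minimal frame hand-off: the preview-callback thread copies each frame
// and queues it; a separate worker thread drains the queue and does the
// slow per-pixel decoding outside the time-critical callback.
public class PreviewPump {
    // Capacity 1: at most one pending frame; newer frames are dropped
    // rather than queuing up behind a slow decode.
    private final BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(1);

    // Call from the preview callback: copy the bytes (the callback's
    // buffer is only valid during the call) and return immediately.
    // Returns false if the worker was busy and the frame was skipped.
    public boolean offerFrame(byte[] data) {
        byte[] copy = new byte[data.length];
        System.arraycopy(data, 0, copy, 0, data.length);
        return queue.offer(copy);
    }

    // Call from the worker thread: fetch the pending frame, or null.
    public byte[] pollFrame() {
        return queue.poll();
    }
}
```

The extra arraycopy is cheap compared to the per-pixel YUV decode, as noted later in this digest.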
[android-developers] Re: Camera.PreviewCallback IOException
Pure, Filter: 1. Your CamApp Activity must implement SurfaceHolder.Callback, and you shouldn't start the camera until the surface is created. 2. The G1 doesn't support a 100x100 preview. It only does 480x320, but your request will not cause any problems; it will just be ignored. When you finally do receive the preview callback, check the actual frame size in onPreviewFrame. 3. The byte[] array data sent to onPreviewFrame is in YCbCr_422_SP format. No other formats are available, even if you attempt to set them. The data is described here: http://groups.google.com/group/android-developers/msg/d3b29d3ddc8abf9b 4. Don't try to decode the data in onPreviewFrame. There isn't enough time - you will hose the camera if you try to hold up the system in that function for so long. Copy the data to your own buffer and decode in a separate Thread. Overlay: This is a completely different story and interesting to me. I have not done this yet on Android, but it works on many other phones - even in Java on occasion. I think this will work in Android. In my attempts on many phones the screen flickers or flashes badly because it is impossible to synchronize the video preview with your paint/draw/blit/etc. Good luck. Hope this helps. - Sincerely, David Manpearl On Dec 10, 12:10 pm, pure per.arn...@anyplanet.com wrote: I am trying to use the PreviewCallback but I get an IOException saying: startPreview failed. I run the emulator in the debugger. I am pretty stuck here. Are there any examples using the PreviewCallback? It would be so nice to get this working since I have 4 great ideas for applications that all need to use the camera preview and apply a filter or overlay. Thanx.
The code that fails:

public class CamApp extends Activity implements Camera.PreviewCallback {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SurfaceView surfView = new SurfaceView(this);
        setContentView(surfView);
        Camera cam = Camera.open();
        Camera.Parameters parameters = cam.getParameters();
        parameters.setPreviewSize(100, 100);
        cam.setParameters(parameters);
        cam.setPreviewCallback(this);
        cam.startPreview();
    }
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
    }
}

Stack trace:

Thread [3 main] (Suspended (exception RuntimeException))
ActivityThread.performLaunchActivity(ActivityThread$ActivityRecord) line: 2140 --- Here is where I can see the IOException
ActivityThread.handleLaunchActivity(ActivityThread$ActivityRecord) line: 2156
ActivityThread.access$1800(ActivityThread, ActivityThread$ActivityRecord) line: 112
ActivityThread$H.handleMessage(Message) line: 1580
ActivityThread$H(Handler).dispatchMessage(Message) line: 88
Looper.loop() line: 123
ActivityThread.main(String[]) line: 3742
Method.invokeNative(Object, Object[], Class, Class[], Class, int, boolean) line: not available [native method]
Method.invoke(Object, Object...) line: 515
ZygoteInit$MethodAndArgsCaller.run() line: 739
ZygoteInit.main(String[]) line: 497
NativeStart.main(String[]) line: not available [native method]
[android-developers] Re: Android Camera Preview Filter Using Camera.PreviewCallback.onPreviewFrame
I decoded the RGB color data from the Android Camera PreviewCallback onPreviewFrame() frame. My function that decodes the YUV byte[] buffer from the preview callback and converts it into an ARGB int[] buffer is presented below. The luminance buffer takes up the first width * height bytes of the byte buffer and can be displayed as a gray-scale image if desired. The chrominance follows, with each U or V value representing a 2x2 square region of 4 luminance values. The bytes are packed as follows (viewed as a grid):

Y1a Y1b Y2c Y2d Y3e Y3f
Y1g Y1h Y2i Y2j Y3k Y3l
Y4m Y4n Y5o Y5p Y6q Y6r
Y4s Y4t Y5u Y5v Y6w Y6x
U1 V1 U2 V2 U3 V3
U4 V4 U5 V5 U6 V6

Google, I strongly believe you should have told us this yourselves and saved me several days. This code has been tested on the HTC G1 device processing 480x320 size frames. Similar functions downrez, in the same single pass through the image, to a smaller frame size for full display on the screen at a faster rate. Note: this processing must be done in a separate Thread from onPreviewFrame().
// decode Y, U, and V values on the YUV 420 buffer described as YCbCr_422_SP by Android
// David Manpearl 081201
public static void decodeYUV(int[] out, byte[] fg, int width, int height)
        throws NullPointerException, IllegalArgumentException {
    final int sz = width * height;
    if (out == null) throw new NullPointerException("buffer 'out' is null");
    if (out.length < sz) throw new IllegalArgumentException("buffer 'out' size " + out.length + " < minimum " + sz);
    if (fg == null) throw new NullPointerException("buffer 'fg' is null");
    if (fg.length < sz * 3 / 2) throw new IllegalArgumentException("buffer 'fg' size " + fg.length + " < minimum " + sz * 3 / 2);
    int i, j;
    int Y, Cr = 0, Cb = 0;
    for (j = 0; j < height; j++) {
        int pixPtr = j * width;
        final int jDiv2 = j >> 1;
        for (i = 0; i < width; i++) {
            Y = fg[pixPtr];
            if (Y < 0) Y += 255;
            if ((i & 0x1) != 1) {
                final int cOff = sz + jDiv2 * width + (i >> 1) * 2;
                Cb = fg[cOff];
                if (Cb < 0) Cb += 127; else Cb -= 128;
                Cr = fg[cOff + 1];
                if (Cr < 0) Cr += 127; else Cr -= 128;
            }
            int R = Y + Cr + (Cr >> 2) + (Cr >> 3) + (Cr >> 5);
            if (R < 0) R = 0; else if (R > 255) R = 255;
            int G = Y - (Cb >> 2) + (Cb >> 4) + (Cb >> 5) - (Cr >> 1) + (Cr >> 3) + (Cr >> 4) + (Cr >> 5);
            if (G < 0) G = 0; else if (G > 255) G = 255;
            int B = Y + Cb + (Cb >> 1) + (Cb >> 2) + (Cb >> 6);
            if (B < 0) B = 0; else if (B > 255) B = 255;
            out[pixPtr++] = 0xff000000 + (B << 16) + (G << 8) + R;
        }
    }
}

On Dec 1, 8:52 am, Dave Sparks [EMAIL PROTECTED] wrote: The G1 preview format is YUV 420 semi-planar (U and V are subsampled by 2 in both X and Y). The Y plane is first, followed by UV pairs - I believe the U sample comes first in the pair. Technically it's YCbCr 420 semi-planar, but very few people use that term. On Nov 26, 6:27 pm, dmanpearl [EMAIL PROTECTED] wrote: Hello Blindfold, Thanks for your help. I solved the user interface problems I was experiencing by using a separate thread to do my image processing. I'm still using an ImageView, and without problems. Perhaps I will try a Canvas in a SurfaceHolder later in this exercise to compare speeds.
MORE ON THE CAMERA LIVE PREVIEW FILTERED DISPLAY CAPABILITY As you know, I want to display live camera data through a custom filter. I got most of the way through the YCbCr_422_SP data buffer returned to Android's Camera.PreviewCallback onCameraFrame() callback function, and now I am looking for help deciphering the U, V portion of the buffer. I verified that the first (width*height) bytes are simple Y luminance values that can be displayed (via Bitmap and ImageView) to make a viable gray-scale image. The total number of bytes is (width * height * 3 / 2). The remaining 1/2 image bytes are clearly used to store U, V (Cb, Cr) data. Therefore, there are 1/4 image bytes for each U, V component (i.e. each U, V component is used for 4 pixels of the image). This looks like 411 or 420 data, not 422, but we have bigger fish to fry. I cannot determine if the U V data is aligned adjacently, in alternating rows, or in squares as described in this Wikipedia graphical description: http://en.wikipedia.org/wiki/Image:Yuv420.svg. Once I finally determine the structure of the U, V data, I have several equations to convert from YUV to RGB, and I have tried many ways of combining the UV data with the luminance data of the first 2/3 of the buffer, to no avail. So far I can only display mono
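As a sanity check on the shift arithmetic in the decodeYUV function posted above: each sum of shifted terms is a fixed-point approximation of a standard BT.601-style YCbCr-to-RGB coefficient (roughly R = Y + 1.402·Cr, G = Y − 0.344·Cb − 0.714·Cr, B = Y + 1.772·Cb). A small standalone verification, with names of my own choosing:

```java
// Each method computes the value of one shift-sum from decodeYUV as a
// double, so it can be compared against the textbook coefficient.
public class ShiftCheck {
    // Cr + (Cr>>2) + (Cr>>3) + (Cr>>5): 1 + 1/4 + 1/8 + 1/32 = 1.40625 (~1.402)
    public static double crToR() { return 1 + 1 / 4.0 + 1 / 8.0 + 1 / 32.0; }
    // (Cb>>2) + (Cb>>4) + (Cb>>5): 1/4 + 1/16 + 1/32 = 0.34375 (~0.344)
    public static double cbToG() { return 1 / 4.0 + 1 / 16.0 + 1 / 32.0; }
    // (Cr>>1) + (Cr>>3) + (Cr>>4) + (Cr>>5): 0.71875 (~0.714)
    public static double crToG() { return 1 / 2.0 + 1 / 8.0 + 1 / 16.0 + 1 / 32.0; }
    // Cb + (Cb>>1) + (Cb>>2) + (Cb>>6): 1.765625 (~1.772)
    public static double cbToB() { return 1 + 1 / 2.0 + 1 / 4.0 + 1 / 64.0; }
}
```

So the integer shifts trade well under 1% of coefficient accuracy for avoiding per-pixel multiplies, which mattered on the G1's CPU.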
[android-developers] Re: Android Camera Preview Filter Using Camera.PreviewCallback.onPreviewFrame
Hello Blindfold, Thanks for your help. I solved the user interface problems I was experiencing by using a separate thread to do my image processing. I'm still using an ImageView, and without problems. Perhaps I will try a Canvas in a SurfaceHolder later in this exercise to compare speeds. MORE ON THE CAMERA LIVE PREVIEW FILTERED DISPLAY CAPABILITY As you know, I want to display live camera data through a custom filter. I got most of the way through the YCbCr_422_SP data buffer returned to Android's Camera.PreviewCallback onCameraFrame() callback function, and now I am looking for help deciphering the U, V portion of the buffer. I verified that the first (width*height) bytes are simple Y luminance values that can be displayed (via Bitmap and ImageView) to make a viable gray-scale image. The total number of bytes is (width * height * 3 / 2). The remaining 1/2 image bytes are clearly used to store U, V (Cb, Cr) data. Therefore, there are 1/4 image bytes for each U, V component (i.e. each U, V component is used for 4 pixels of the image). This looks like 411 or 420 data, not 422, but we have bigger fish to fry. I cannot determine if the U V data is aligned adjacently, in alternating rows, or in squares as described in this Wikipedia graphical description: http://en.wikipedia.org/wiki/Image:Yuv420.svg. Once I finally determine the structure of the U, V data, I have several equations to convert from YUV to RGB, and I have tried many ways of combining the UV data with the luminance data of the first 2/3 of the buffer, to no avail. So far I can only display monochrome. If you or others on this list can decode the Android YCbCr_422_SP data, please post the solution as soon as possible. Your efforts and generosity are greatly appreciated. I am convinced that representatives from Google/Android and others monitoring this list know how to do this. Please share the information. It is crucial to our project. I do not care about the Emulator and its different encoding.
I realize that Google is probably waiting to implement a unified solution and share it through an API update, but we cannot wait. - Thank you, David Manpearl On Nov 26, 10:23 am, blindfold [EMAIL PROTECTED] wrote: Hi David, I can't seem to make these coexist: a SurfaceHolder for the camera and an ImageView for the filtered Bitmap to display. ... Do you know why I can't make the Camera's Surface and an ImageView Bitmap simultaneous members of the same active ViewGroup? I do not use ImageView myself so I cannot really judge your problem. I draw my filtered Bitmap to a Canvas in a SurfaceView. No ImageView anywhere. Regards
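For reference, the layout eventually worked out in this thread (a Y plane followed by interleaved Cb/Cr pairs, one pair shared by each 2x2 block of pixels) reduces to simple index arithmetic over the width*height*3/2-byte buffer. A plain-Java sketch, with class and method names of my own choosing:

```java
// Index math for a YCbCr 420 semi-planar preview buffer:
// the Y sample for pixel (x, y) sits at y*width + x, and its shared
// Cb/Cr pair starts at width*height + (y/2)*width + (x/2)*2 (Cb, then Cr).
public class Yuv420spIndex {
    public static int yIndex(int width, int x, int y) {
        return y * width + x;
    }
    public static int cbIndex(int width, int height, int x, int y) {
        return width * height + (y >> 1) * width + (x >> 1) * 2;
    }
    public static int crIndex(int width, int height, int x, int y) {
        return cbIndex(width, height, x, y) + 1; // Cr follows its Cb
    }
}
```

This matches the "squares" variant of the Wikipedia diagram: all four pixels of a 2x2 block resolve to the same chroma pair.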
[android-developers] Re: Android Camera Preview Filter Using Camera.PreviewCallback.onPreviewFrame
Peter and Jeff, Thanks to each of you for your fantastic help. I am very close to achieving a filtered preview display as required for my project - modification of the Android camera display from onPreviewFrame. Here is my current stumbling block: I can't seem to make these coexist: a SurfaceHolder for the camera and an ImageView for the filtered Bitmap to display. One must assign the Camera preview display into a SurfaceHolder in order to get preview callbacks: mCamera.setPreviewDisplay(surfaceHolder); The SurfaceHolder's Surface must be a child of the current View in order to receive surfaceCreated, surfaceChanged, and surfaceDestroyed callbacks. I have been using an ImageView object to display the Bitmap into which I am writing my decoded preview data via an int[] array. Therefore, both the SurfaceView and the ImageView seem to have to be children of the same ViewGroup, which is currently displayed. However, as soon as I add the ImageView to the ViewGroup via linearLayout.addView(imageView); or absoluteLayout.addView(imageView); etc., my Activity stops receiving Camera preview callbacks in onPreviewFrame(). LogCat indicates an AndroidRuntime ERROR: thread attach failed error. I am not sure if this is related. I have tried setting the Surface to the same size as the Camera preview size setting in the surfaceChanged callback: Camera.Parameters parameters = mCamera.getParameters(); final Size preSz = parameters.getPreviewSize(); mHolder.setFixedSize(preSz.width, preSz.height); I tried to create and manage the Surface (needed by Camera preview) outside of the SurfaceHolder callbacks, using the Activity's onResume and onPause methods. I did this so that I would not have to put the SurfaceView into the active display hierarchy (i.e. remove it as a child of the current View), so that it would not conflict with the ImageView as above.
Unfortunately, this causes a Camera exception, app passed NULL surface, directly after my call to: surfaceView = new SurfaceView(this); mHolder = surfaceView.getHolder(); // mHolder.addCallback(this); Do you know why I can't make the Camera's Surface and an ImageView Bitmap simultaneous members of the same active ViewGroup? Thanks again. Your generous responses help me more than you might realize. - Regards, David Manpearl On Nov 25, 12:47 am, blindfold [EMAIL PROTECTED] wrote: How come I recognize so many of these findings? ;-) 4. I believe that processing breaks down whenever I spend too much time in the onPreviewFrame function. That's what I observed too: so do the heavy-duty image processing outside onPreviewFrame(), with supplementary frame skipping and subsampling as needed. Then there is no problem quitting the app either. I'd incur an extra copy of the YUV_422 byte[] buffer from onCameraFrame into the Thread prior to processing. Yes, you need to take care of the limited data[] lifetime in onPreviewFrame(). Between this and skipping frames that overlap the frame-processing thread, this might drastically reduce the filter/display speed. I doubt that the buffer copying matters (especially when using arraycopy) as compared to the time lost in decoding the preview image (pixel-by-pixel) for lack of an Android API with a native decoding method for the preview format(s). I've looked at various offsets within each expected set of 6 bytes for each 2 pixels. For the Y (luminance) part, just use the first block in data[]: the colors are not interleaved with Y; Y is in one block of bytes with offset zero in data[], with one byte per pixel. So decoding Y is easy, and identical for emulator and G1 (except for the stride due to preview image size differences: default 176x144 vs 480x320). Regards
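Blindfold's point that "decoding Y is easy" can be shown in a few lines: each luminance byte of the first width*height bytes maps straight to an opaque gray ARGB pixel. A plain-Java sketch (the class and method names are mine):

```java
public class GrayDecoder {
    // Convert the Y plane (the first width*height bytes of a preview
    // buffer) into opaque gray ARGB pixels. Java bytes are signed, so
    // mask with 0xff to recover the unsigned 0..255 luminance.
    public static int[] yPlaneToArgb(byte[] frame, int width, int height) {
        int[] out = new int[width * height];
        for (int i = 0; i < out.length; i++) {
            int y = frame[i] & 0xff;                        // unsigned luminance
            out[i] = 0xff000000 | (y << 16) | (y << 8) | y; // gray ARGB pixel
        }
        return out;
    }
}
```

The resulting int[] can then be pushed into a Bitmap (for example via setPixels) for display, as described elsewhere in this thread.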
[android-developers] Re: Android, A Programmer's Guide by Jerome (J.F.) DiMarzio
Lots of Android development book choices on Amazon. Available now:
Professional Android Application Development by Reto Meier (Paperback - Nov 24, 2008) $29.69
The Busy Coder's Guide to Android Development by Mark L. Murphy (Paperback - Jun 25, 2008) $27.93
Android: A Programmer's Guide by Jerome DiMarzio (Paperback - Jul 30, 2008) $26.39
Android Essentials (Firstpress) by Chris Haseman (Paperback - Jul 21, 2008) $17.99
Available for pre-order:
Hello, Android: Introducing Google's Mobile Development Platform by Ed Burnette (Paperback - Dec 28, 2008) $21.75
Pro Android: Developing Mobile Applications for G1 and Other Google Phones (Pro) by Sayed Y. Hashimi and Satya Komatineni (Paperback - April 20, 2009) $29.69
Unlocking Android by Frank Ableson, Charlie Collins, Robi Sen, and Robert Cooper (Paperback - Mar 28, 2009) $26.39
Android Application Development: Programming with the Google SDK by Rick Rogers and John Lombardo (Paperback - Mar 15, 2009) $26.39
- David
On Nov 24, 7:19 pm, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote: Thanks to everyone who answered this post. I have gone to the bookstore and purchased Professional Android Application Development by Meier. I have also gone to several of the sites listed. I now feel as if I'm back on track. It's been a week since I've been able to do anything in Android as I've been busy and really felt stalled with that other book. So once again, thank you to everyone who offered their support. I hope I can write an app that will make you all happy that you helped me; and I hope I make a whole lot of money too :)
[android-developers] Android Camera Preview Filter Using Camera.PreviewCallback.onPreviewFrame
Will someone please send an example of how to process the camera preview with a filter in the camera preview callback, Camera.PreviewCallback.onPreviewFrame? Here is what I learned from posts on this forum: 1. Don't use SURFACE_TYPE_PUSH_BUFFERS: // Comment out PUSH_BUFFERS in order to hide the surface. // mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); This solution does hide the camera preview - good. I'm not sure which surface type to use instead. I've been using nothing, or SURFACE_TYPE_NORMAL, which I assume is the default. 2. The preview data is delivered in a version of YCbCr_422_SP or YCbCr_420_SP, which is different on the emulator and on the G1 device. I have been receiving buffers that are exactly 1.5 times the size of the image (i.e. bytes = w * h * 3 / 2). I have not yet decoded the data, as I still have bigger fish to fry. This is what I have learned: 1. I can generate a small int[] data buffer using 4-byte Colors. I can turn it into a Bitmap with Bitmap.setPixels. I can display it from within the onPreviewFrame callback. The goal here is to convert the YCbCr_422_SP byte data into an int[] array with per-pixel calculations, convert that into a Bitmap, and display it through an ImageView object. 2. If I process every frame, then the application on the emulator or G1 cannot trap the Back key and never quits. It finally terminates with an exception. Furthermore, on the G1 device, the camera is left in a state where it will never run again in any application without a power cycle of the device. This makes sense considering that mCamera.stopPreview() and mCamera.release() never get called. 3. I can generate and display a 160x120 image at almost 10 frames per second, even considering the skipped frames. So far so good. 4. The G1 phone always delivers 480x320 size frames. The emulator delivers an odd size that is taller than it is wide (I don't remember exactly - something like 312x80 - yes, that bizarre).
This is checked by reading the parameters in the camera object returned to the onPreviewFrame callback: Camera.Parameters params = camera.getParameters(); Camera.Size camSz = params.getPreviewSize(); Here are my problems: 1. Test-generated data displays correctly and fast (I alternate colored frames and can see them switching quickly on the screen of the G1, slower on the emulator). When I try to convert the byte data from the camera, all of it looks black. I've looked at various offsets within each expected set of 6 bytes for each 2 pixels. I need help with the decoding, but I worry that I am receiving empty buffers. 2. If I process larger frames, or do too much processing or Log() output in the onPreviewFrame callback, then the application stops receiving any callbacks. 3. I have tried using a boolean lock variable to ensure that there are no reentrant calls to onPreviewFrame. This did not help - besides, I don't think Android would enter the callback this way. 4. I believe that processing breaks down whenever I spend too much time in the onPreviewFrame function. Therefore, I think I could solve my problems with a Thread for either the camera, the processing of frames captured in the onPreviewFrame callback, or both. Unfortunately, in order to process each frame in a different thread, I'd incur an extra copy of the YUV_422 byte[] buffer from onCameraFrame into the Thread prior to processing. Between this and skipping frames that overlap the frame-processing thread, this might drastically reduce the filter/display speed. So, as a new Android programmer (who finds it a lot more like J2ME than he had hoped), before launching into another several days of battling with Bitmaps, Viewables, Surfaces, and now Threads, would someone please give me their solution or partial solution for displaying data from onPreviewFrame? I want the mechanics of the Android architecture as much or more than I'd like the YUV decoder.
Several people on this list have done this and complained about speed. One poster has so far only extracted the luminance data. All these solutions are good enough for me. Please send via this list or direct to dmanpearl_at_gmail_dot_com. Thank you ever so much! - David
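On point 3 above (the boolean lock): a race-free version of that idea, used as a gate between the callback and a worker thread, also gives the frame-skipping behavior discussed here. Frames that arrive while the worker is busy are dropped instead of queuing up. A plain-Java sketch with names of my own choosing:

```java
import java.util.concurrent.atomic.AtomicBoolean;

// A busy flag shared by the preview callback and the decode worker.
// compareAndSet makes the claim atomic, unlike a plain boolean field.
public class FrameGate {
    private final AtomicBoolean busy = new AtomicBoolean(false);

    // Called per frame from the callback: returns true if this frame
    // claimed the worker; false means the frame should be skipped.
    public boolean tryBegin() {
        return busy.compareAndSet(false, true);
    }

    // Called by the worker when it finishes a frame.
    public void end() {
        busy.set(false);
    }
}
```

The callback then does essentially: if (gate.tryBegin()) { copy the buffer, hand it to the worker } else { drop the frame }, and the worker calls gate.end() when its decode pass completes.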
[android-developers] How to Display Captured Camera Frame
I am trying to capture a picture from the Android camera - preferably raw, but jpg is OK - and then display it on the G1 screen. My problem is that only black frames are displayed. The preview does work. This code works on the emulator, but displays a black frame on the G1 device. The emulator displays an interesting picture of the green Android robot and a BlackBerry-like phone. I have tried many options for the Camera parameters, including various PixelFormats and resolutions. In addition to the code below, I have tried various versions with BitmapDisplayables and ByteArrayInputStreams. Here is the code I am currently using to attempt to display the image: Camera.PictureCallback pictureCallback = new Camera.PictureCallback() { public void onPictureTaken(byte[] data, Camera camera) { Bitmap bm = BitmapFactory.decodeByteArray(data, 0, data.length); ImageView iv = new ImageView(cx); iv.setImageBitmap(bm); linearLayout.addView(iv); } }; Does anyone have any idea why this code is not working? Thank you for any help you can offer. - Sincerely, David Manpearl
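When decodeByteArray yields only black (or null) bitmaps, one cheap diagnostic is to check whether the callback delivered plausible JPEG bytes at all: JPEG data starts with the SOI marker 0xFF 0xD8 and ends with the EOI marker 0xFF 0xD9. A plain-Java sketch of that check (the class and method names are mine, not an Android API):

```java
public class JpegCheck {
    // Returns true if the buffer looks like a complete JPEG: it must
    // start with SOI (0xFF 0xD8) and end with EOI (0xFF 0xD9).
    // Java bytes are signed, so mask with 0xff before comparing.
    public static boolean looksLikeJpeg(byte[] data) {
        if (data == null || data.length < 4) return false;
        return (data[0] & 0xff) == 0xff && (data[1] & 0xff) == 0xd8
            && (data[data.length - 2] & 0xff) == 0xff
            && (data[data.length - 1] & 0xff) == 0xd9;
    }
}
```

If the markers are absent, the problem is in the capture path (an empty or raw buffer) rather than in the ImageView display code.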