[android-developers] Re: Accelerometer use (G's and frequency)
1) and 2) are hardware dependent. If you need a higher acceleration range than standard handsets offer, you'd probably have to have custom hardware built. Accelerometer chips are made to many different specs, but generally, the higher the sensitivity, the lower the range. Remember also that Android devices can expose both a linear acceleration sensor and a gravity-including accelerometer: one measures zero acceleration when lying still, the other measures 1 g (~9.81 m/s^2) when lying still. The update rate is also determined by the specific device.

3) An accelerometer is probably not a good way to detect a golf swing - at least not alone. Accelerometer sensors measure linear accelerations. If you combine one with a gyroscope sensor, you could test for a simultaneous rotation and high-g type of movement; otherwise it would be difficult to determine whether the user is merely shaking his phone... Also be aware that even just a couple of g's is quite a lot - Android devices aren't really meant to be weaponized. Not yet, at least...

Cheers
Christian

On Monday, July 23, 2012 7:35:28 PM UTC+2, JAM wrote:

Hello all. Hope someone can help.

1) Is there a way to have a program read higher than 1-2 g? I had a beta version built and it doesn't seem to be able to read more than 1 g on some of the phones I've tested it on.

2) Is there a way to get the phone to deliver samples faster than the 20-40 ms it seems to do when set to fastest? I need a more accurate profile of the accelerometer output than I'm getting.

3) If I gave someone a specific profile of something I'd like the accelerometer to find (a spike that gets above 2 g and stays there for a duration of 5 ms or longer -- like a golf swing), could it be programmed to see such an incident consistently?

-- You received this message because you are subscribed to the Google Groups Android Developers group.
To post to this group, send email to android-developers@googlegroups.com To unsubscribe from this group, send email to android-developers+unsubscr...@googlegroups.com For more options, visit this group at http://groups.google.com/group/android-developers?hl=en
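For what it's worth, the spike-detection logic in question 3 is easy to separate from the sensor plumbing. Below is a minimal, hypothetical sketch of the detection part: given timestamped magnitude samples (in a real app these would come from a SensorManager listener registered with SENSOR_DELAY_FASTEST), it reports whether any run of samples stays above a g-threshold for a minimum duration. The class and method names are illustrative, not from any Android API.

```java
// Hypothetical sketch: detect a spike that exceeds thresholdG and stays
// above it for at least minDurationMs, given parallel arrays of sample
// timestamps (nanoseconds, as SensorEvent.timestamp provides) and
// acceleration magnitudes in m/s^2.
public class SpikeDetector {
    static final double G = 9.81; // m/s^2 per g

    public static boolean hasSpike(long[] timesNanos, double[] magnitudes,
                                   double thresholdG, long minDurationMs) {
        long minDurationNanos = minDurationMs * 1_000_000L;
        long runStart = -1; // timestamp where the current above-threshold run began
        for (int i = 0; i < magnitudes.length; i++) {
            if (magnitudes[i] > thresholdG * G) {
                if (runStart < 0) runStart = timesNanos[i];
                if (timesNanos[i] - runStart >= minDurationNanos) return true;
            } else {
                runStart = -1; // run broken, reset
            }
        }
        return false;
    }
}
```

Note that at the 20-40 ms sample spacing mentioned above, a genuine 5 ms spike may land on only one sample (or none), which is part of why capturing an accurate profile is hard on stock hardware.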
[android-developers] Re: android based robot
I tried it once, using the Bluetooth on my smartphone to communicate with an Arduino-based motor controller - I stopped after a bit of experimenting, though.

On Tuesday, October 2, 2012 10:26:05 AM UTC+2, Rajat Trivedi wrote:

Hey, I want to develop an Android-based mobile robot, so can somebody tell me where I have to start? I'm just confused.
[android-developers] Video color effects
Hi,

I'm working on an app that, among other things, will allow users to make short videos, apply funny effects to them, and share them with one another. To begin with, I'm looking for simple color effects like grayscale, sepia toning, and such.

All this would be very simple using the Camera class, which can apply color effects at recording time - at least most phones' cameras can; I've tested some using *Camera.getParameters().getSupportedColorEffects()*. But the thing is: I need to do it *after* the recording has been done. The user would open a video and choose among a set of effects to apply.

I can't for the love of *** find a good way to do this. Android doesn't seem to include any video utilities in the SDK. The *android.media.effect* package can do some effects, but only the *backdropper* effect works on videos; the rest are for images. Extracting bitmaps from the SurfaceView of a VideoView during playback doesn't work - it just returns an all-black bitmap. It seems there's no way to intercept the data stream between storage and the screen and apply the effects there.

I've started looking into using the FFmpeg library to decode a video file so I can get access to the data, but that requires quite a bit of native coding, and also separate builds for various CPU architectures, so it's very messy.

I thought that since the camera can apply these effects (on a Sony LT26i: *none, mono, negative, solarize, sepia, posterize*), perhaps one could feed the recorder a video stream not from the camera but from memory, and that way use a stored video file?

Does anyone know of a way to apply effects to a video after it has been recorded?

Christian.
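For reference, once you do get at decoded frame data (via FFmpeg or otherwise), the per-pixel math for an effect like grayscale is trivial. Below is a minimal, hypothetical sketch operating on one frame as a packed ARGB int array (the layout android.graphics.Bitmap uses with getPixels()); the decode/encode steps are assumed to happen elsewhere, and the luma weights are the standard Rec. 601 ones.

```java
// Hypothetical sketch: apply a grayscale effect to one decoded video frame,
// represented as packed 0xAARRGGBB ints. The actual frame extraction
// (FFmpeg, MediaCodec, ...) is out of scope here.
public class GrayscaleEffect {
    public static int[] toGrayscale(int[] argb) {
        int[] out = new int[argb.length];
        for (int i = 0; i < argb.length; i++) {
            int p = argb[i];
            int a = (p >>> 24) & 0xFF;
            int r = (p >>> 16) & 0xFF;
            int g = (p >>> 8) & 0xFF;
            int b = p & 0xFF;
            // Rec. 601 luma: Y = 0.299 R + 0.587 G + 0.114 B (integer form)
            int y = (299 * r + 587 * g + 114 * b) / 1000;
            out[i] = (a << 24) | (y << 16) | (y << 8) | y;
        }
        return out;
    }
}
```

Effects like negative or sepia are similar per-pixel transforms; the hard part, as the post says, remains getting decoded frames in and re-encoded frames out on Android.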