Re: [PATCH 6/6] i2c: davinci: bus recovery procedure to clear the bus
On 02/08/2010 09:33 PM, Nori, Sekhar wrote:
> On Mon, Feb 08, 2010 at 20:43:27, Philby John wrote:
>> Hello Sekhar,
>>
>> On 02/08/2010 04:05 PM, Nori, Sekhar wrote:
>>>> +static void generic_i2c_clock_pulse(unsigned int scl_pin)
>>>> +{
>>>> +	u16 i;
>>>> +
>>>> +	if (scl_pin) {
>>>> +		/* Send high and low on the SCL line */
>>>> +		for (i = 0; i < 9; i++) {
>>>> +			gpio_set_value(scl_pin, 0);
>>>> +			udelay(20);
>>>> +			gpio_set_value(scl_pin, 1);
>>>> +			udelay(20);
>>>> +		}
>>>
>>> Before using the pins as GPIO, you would have to set the
>>> functionality of these pins as GPIO. You had this code in previous
>>> incarnations of this patch - not sure why it is dropped now.
>>
>> I don't seem to remember having the code in the old versions, at
>> least not in generic_i2c_clock_pulse(). The functions
>> disable_i2c_pins() and enable_i2c_pins() were discarded as the I2C
>> protocol spec did not specify the need. Moreover, the bus recovered
>> without it. (Tested on DM355 and DM6446.)
>
> Yes, I was referring to the davinci_cfg_reg() calls in the
> {disable|enable}_i2c_pins() functions. Per the specification of the
> DaVinci devices, a pin needs to be muxed as 'GPIO' if it is to be
> used as a GPIO controlled by the GPIO module. It may have worked on a
> couple of devices but cannot be guaranteed to work on all DaVinci
> devices (esp. DA8XX ones).
>
>> I think that using davinci_cfg_reg() in generic_i2c_clock_pulse() is
>> the wrong place to put it. This would require adding
>> davinci_cfg_reg() for all known DaVinci platforms. The I2C recovery
>> procedure is correct to assume that it owns the SCL line at that
>> very moment. Instead, I believe pinmuxing using davinci_cfg_reg()
>> should be done early, just like we do for DM6446 in devices.c --
>> davinci_init_i2c() -- for all other platforms. What I could do in
>> generic_i2c_clock_pulse() is set SCL to output, and use
>> gpio_request() by checking the REVID2 register value (0x6 for DA8xx
>> and 0x5 for others).
>
> But the pins should remain as I2C pins till you actually hit a bus
> lock-up. That's when you need to convert them to GPIO pins and start
> the recovery by pulsing SCL. If you make them GPIO right at the
> start, they won't be usable as I2C pins for normal transfers?

Right. I was also hoping to get rid of the cpu_is_xxx() usage. The
only other way I could think of is to add a pinmux index into the I2C
platform data struct. What do you think is the best approach?

Regards,
Philby

_______________________________________________
Davinci-linux-open-source mailing list
Davinci-linux-open-source@linux.davincidsp.com
http://linux.davincidsp.com/mailman/listinfo/davinci-linux-open-source
RE: [PATCH 6/6] i2c: davinci: bus recovery procedure to clear the bus
On Tue, Feb 09, 2010 at 15:45:15, Philby John wrote:
[earlier quoted exchange trimmed]
> Right. I was also hoping to get rid of the cpu_is_xxx() usage. The
> only other way I could think of is to add a pinmux index into the
> I2C platform data struct. What do you think is the best approach?

I think passing the pinmux index through platform data is fair.

Thanks,
Sekhar
How to integrate another decoder to my existing decoder?
Hi all,

I am using the DM6446. I used the video_copy example to implement my
codec, and that works. Now I have done another one, also based on the
video_copy example, and I want to integrate both of them.

Note: currently I have 2 separate folder structures, each one
containing the folders named apps, buildutils, codecs, and servers. I
want to have only one folder structure for the 2 codecs.

Is there any document/URL/help/suggestion on how to do this? Or is
this not possible at all?

I tried the following but it failed. I modified ceapp_init(), which
opens the codec engine (Engine_open()) and creates (VIDDEC_create())
two video decoders attached to it, as below:

	// reset, load, and start DSP Engine
	if ((ceHandle = Engine_open(engineName, NULL, NULL)) == NULL) {
		printf("CEapp-> ERROR: can't open engine %s\n", engineName);
		goto init_end;
	} else
		printf("CEapp-> Engine opened %s\n", engineName);

	// activate DSP trace collection thread
	TraceUtil_start(engineName);

	// allocate and initialize video decoders on the engine
	decHandle1 = VIDDEC_create(ceHandle, decoderName1, NULL);
	if (decHandle == NULL) {
		printf("CEapp-> ERROR: can't open codec %s\n", decoderName);
		goto init_end;
	}
	decHandle2 = VIDDEC_create(ceHandle, decoderName2, NULL);
	if (decHandle == NULL) {
		printf("CEapp-> ERROR: can't open codec %s\n", decoderName);
		goto init_end;
	}

	// success
	status = 0;

I also created two functions named ceapp_decodeBuf1() and
ceapp_decodeBuf2(), each one calling the corresponding decoder.

That is all I did. Am I right, is this totally wrong, or are there
other modifications that I missed?

Your help is highly appreciated.

Thanks,
Mohamed AbdElwahed Ibrahim
OMAP-L13x DDR Self-refresh bug
Hi,

I have discovered a bug in the cpuidle code for the OMAP-L13x (da8xx)
platform. In the state described as "WFI and DDR Self-Refresh", the
code is actually putting the DDR into Power Down mode, which is not
the same as self-refresh. So the description string should read "WFI
and DDR Power Down".

Putting the DDR into self-refresh and subsequently gating the clocks
for greater power savings requires significantly more work (refer to
Section 2.16 of the OMAP-L1x DDR2/mDDR Memory Controller User's
Guide).

Cheers,
James
RE: OMAP-L13x DDR Self-refresh bug
Hi James,

On Tue, Feb 09, 2010 at 21:51:01, James Nuss wrote:
> I have discovered a bug in the cpuidle code for the OMAP-L13x (da8xx)
> platform. In the state described as "WFI and DDR Self-Refresh", the
> code is actually putting the DDR into Power Down mode, which is not
> the same as self-refresh. So the description string should read "WFI
> and DDR Power Down".

Yes, going into power down on OMAP-L138 was intentional (it is
supposed to save more power than self-refresh). I agree the state
description could have been made conditional on the pdata->ddr2_pdown
value.

> Putting the DDR into self-refresh and subsequently gating the clocks
> for greater power savings requires significantly more work (refer to
> Section 2.16 of the OMAP-L1x DDR2/mDDR Memory Controller User's
> Guide).

This is implemented as part of suspend-to-RAM support. Have a look at
the arch/arm/mach-davinci/pm.c file.

Thanks,
Sekhar
RE: How to integrate another decoder to my existing decoder?
There is a good manual here, named Codec Engine Server Integrator
User's Guide: http://focus.ti.com/lit/ug/sprued5b/sprued5b.pdf. It
describes what you need to do to put multiple codecs in an
application. You will need to create a Codec Server that contains
both your algorithms (codecs), since you can load only one server at
a time on the DSP.

You say you tried but didn't state details about what you tried, nor
about how it failed. You will get better support if you provide more
detail, but hopefully the guide I pointed to will be enough to get
you going in the right direction.

Regards,
- Rob

From: Mohamed AbdElwahed
Sent: Tuesday, February 09, 2010 5:02 AM
To: Davinci Mailing list
Subject: How to integrate another decoder to my existing decoder?
[original message quoted in full; trimmed]