Re: Why is Time treated as an ugly child in labview
> You will find that in Labview, the squeaky wheel gets the grease!
> Until recently, EVENTS in Labview were treated as ugly children, now,
> the entire architecture of Labview is event driven. This change was
> brought about only because of vocal popular demand. I believe more
> input from users like you (and ME!) will eventually do for TIME what
> we've done for EVENTS.

While feedback is always welcome, just to be clear, I think stating that the entire architecture is now based upon events is a bit much. In fact, the marrying of events with dataflow was a bit tricky, and I'm pretty happy with how it turned out, but LV is still very much a dataflow environment. Events are there to help out with UI, and the other synchronization primitives that were already there are now supplemented by user events. User events are pretty much notifiers that feed into the event structure.

As for time, in LV 7.1 I believe the timestamp is now a 128-bit integer that measures time at a much finer resolution and over a much larger scale. This should work well for HW-timed measurements. Of course the other time systems, including the ones the OS and Excel use, will still be with us.

If you or anyone else has more concrete suggestions, please let us know about them, here or better yet in a suggestion form.

Greg McKaskle
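For readers wondering what the 128-bit timestamp buys you: NI documents the LV 7.x timestamp as a fixed-point value, a signed 64-bit count of whole seconds since the LabVIEW epoch (1904-01-01 00:00:00 UTC) plus an unsigned 64-bit binary fraction of a second. A minimal Python sketch of the conversion (Python stands in for LabVIEW here since a diagram can't be shown in text; the layout is an assumption based on NI's published format, not something stated in this thread):

```python
from datetime import datetime, timezone

# Seconds between the LabVIEW epoch (1904-01-01 UTC)
# and the Unix epoch (1970-01-01 UTC): 24107 days.
LV_TO_UNIX_OFFSET = 2_082_844_800

def lv_timestamp_to_datetime(whole_seconds: int, fraction: int) -> datetime:
    """Convert a LabVIEW-style 128-bit timestamp (i64 whole seconds since
    1904, u64 binary fraction of a second) to a UTC datetime."""
    frac_seconds = fraction / 2**64      # u64 fixed-point fraction -> [0, 1)
    unix_seconds = whole_seconds - LV_TO_UNIX_OFFSET + frac_seconds
    return datetime.fromtimestamp(unix_seconds, tz=timezone.utc)
```

The 64-bit fraction is far finer than any hardware clock can resolve, so in practice the usable resolution is set by the timing source feeding the timestamp, not by the format.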
9-bit serial with VISA, LV7.1
Good afternoon all,

I've noticed that there has already been some discussion about the improvement of certain aspects of LabVIEW VISA usage with the release of LabVIEW 7.1. Has anyone any feel for the impact on response time, particularly in setting attributes (parity, etc.), whether an improvement or a negative impact? I'm interested in the differences between 7.1, 7.0, and 6.1. At my customer's request I have been working in 6.1, but they approached me Friday and said that at least one of their engineers was going to try 7.1 as soon as they got the CD.

My problems with doing 9-bit serial are _almost_ resolved, but the required parity switching between send and receive is on the hairy edge, timing-wise, sometimes missing the first byte (the responding device's address). I have now gotten it to be about 98% reliable, but am looking to make it more like 99.99%. Also, is there any impact to running an "exe" version, or is what I'm looking at more a function of the Windows-VISA interface?

Thanks for the suggestions sent directly, and if KevinZ could write to me again, I lost your email address and would like to continue the thread we had started.

Thanks again to all, happy wiring,

Putnam Monroe
Senior Engineer
Summit Technology
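For anyone following along, the standard way to fake 9-bit framing on a PC UART is the parity trick Putnam describes: send the address byte with mark parity (forcing the "9th bit" to 1) and data bytes with space parity (9th bit 0), toggling the parity attribute between writes. A small Python sketch of just the framing decision (names are illustrative; in LabVIEW this corresponds to setting the VISA serial parity attribute before each write):

```python
from enum import Enum

class Parity(Enum):
    MARK = "mark"    # parity bit forced to 1 -> emulated 9th bit set
    SPACE = "space"  # parity bit forced to 0 -> emulated 9th bit clear

def frame_parity(is_address: bool) -> Parity:
    """Address bytes go out with mark parity, data bytes with space parity."""
    return Parity.MARK if is_address else Parity.SPACE

def plan_frames(address: int, payload: bytes):
    """Return (byte, parity) pairs for one 9-bit message: the address
    frame first, then the data frames."""
    frames = [(address, frame_parity(True))]
    frames += [(b, frame_parity(False)) for b in payload]
    return frames
```

The timing hazard in the thread is exactly the gap between frames: the parity attribute must be rewritten before the next byte goes out, and any OS or driver latency there can clip the first byte of a reply.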
Re: Limit in waveform data size?
Dear Carnivorus:

I would venture to say that perhaps there is not enough memory to complete the operation! :) All snideness aside, working with large arrays takes some serious number crunching. You might try twiddling your swap file size (on the Advanced properties tab of My Computer in Win2K and later). However, you're probably bumping up against the absolute limit of the OS.

If there is ANY way of dividing and conquering your data so as to not have to deal with such a large array, I think you should investigate it. You may find you don't need anywhere near the samples you think you do. (Voice of experience here!) Or perhaps you don't need the dynamic range. (On most DAQ cards you can't change the number of bits, but you might be able to use a card with fewer than you're currently using.)

Eric
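The divide-and-conquer suggestion can be as simple as streaming the data through in fixed-size blocks and keeping only a reduced result, rather than holding one giant array in memory. A rough Python sketch of the pattern (in LabVIEW this would be a loop reading blocks from the acquisition buffer; the function name and chunk size are illustrative):

```python
def chunked_minmax(sample_source, chunk_size=65536):
    """Scan an arbitrarily long sample stream in fixed-size chunks,
    keeping only the running min/max instead of the whole array."""
    lo, hi = float("inf"), float("-inf")
    chunk = []
    for s in sample_source:
        chunk.append(s)
        if len(chunk) == chunk_size:
            lo, hi = min(lo, min(chunk)), max(hi, max(chunk))
            chunk = []      # release the block before reading the next one
    if chunk:               # reduce the final partial chunk
        lo, hi = min(lo, min(chunk)), max(hi, max(chunk))
    return lo, hi
```

Any reduction that doesn't need the whole record at once (min/max, mean, RMS, histogram, decimation for display) fits this shape, and peak memory stays at one chunk regardless of record length.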
Re: Why is Time treated as an ugly child in labview
Greetings:

You will find that in LabVIEW, the squeaky wheel gets the grease! Until recently, EVENTS in LabVIEW were treated as ugly children; now, the entire architecture of LabVIEW is event driven. This change was brought about only because of vocal popular demand. I believe more input from users like you (and ME!) will eventually do for TIME what we've done for EVENTS.

Eric
Re: Upgrading LAbVIEW - all the time the same problem.
Hi Dennis

As with most LV-related things, here too I think you are right.

kind regards
Pawel
Re: Upgrading LAbVIEW - all the time the same problem.
Hi Luca

Thank you for the answer. I understand. In my situation I would prefer to have the latest version installed. I can live with some different directories, but 90% will be overwritten with the new version. So do you think I can install the new version over the old one? What about the uninstall information in the system registry and all its keys? Will they be overwritten, or stay there to the end of time? Probably they will stay and only stuff the registry. If so, I guess that uninstalling first and then installing the new version is the right choice. Well, it looks like I have answered my own question as well :))

Thanks, regards
Pawel
Re: Executable crashes
Lewis,

What about installing LV 7.0 without the new IMAQ image format? I think by default the LV 7.0 installation does not install the new image format, so maybe LV 7.0 with IMAQ 6.x might be a working combination?

PJM

Lewis Drake <[EMAIL PROTECTED]> wrote:
> I have a large vision based (IMAQ) application that I developed as an
> executable for one of my customers using LV v6.1. [remainder of the
> original message, quoted in full under "Executable crashes" below, snipped]
Executable crashes
I have a large vision based (IMAQ) application that I developed as an executable for one of my customers using LV v6.1. The software uses the serial port to communicate with a microscope stage and has a PCI1409 frame grabber card. We are using Windows XP computers. The customer has a CD with all the necessary software to create a working system, their "Golden Master". They have used the Golden Master many times to create new systems. Last week, they installed the software on a new PC (it ran fine) then gave the PC to me to use in converting the development version of their software to LV v7.0. I installed LV, IMAQ, NI-IMAQ, etc. and did the conversion to v7.0. The only code changes required involved the different IMAQ image format. The software ran fine under the development version, and I was quite pleased.

Then, I started up the LV v6.1 executable that was originally installed by my customer. The executable opened OK and put up a few images, but then crashed and requested to send an error report to Microsoft. My customer's LV v6.1 program is now inoperable on the new computer. Something was changed when LV 7.0 was installed. I tried installing an earlier version of NI-IMAQ with no success.

Hoping to solve the problem, I created an executable in LV v7.0. It also crashed (i.e. LVRT.dll crashed) after behaving in the same manner as the LV v6.1 executable. However, the LV v7.0 development version of the program runs fine with no problems.

My customer then reclaimed their computer and uninstalled all of the NI software that I had installed. Next, they reinstalled the software from their Golden Master CD and started up the LV v6.1 executable. It crashed as described above. Now, that computer will not run their software successfully. It probably would if the hard disk was reformatted and Windows was reinstalled, but we have not done that yet. We did try some of the special tools NI has for purging NI software from a system, etc.
I tried running the LV v6.1 executable on my development computer (contains LV v7.0). It crashed as described above. Next, I created a LV v7.0 executable on my development computer and it crashed as described above. So, I'm now in the position where I can not upgrade their software to LV v7.0 and create an executable. NI tech support has no clue what the problem is, nor do I. This software contains hundreds of VIs so troubleshooting will be extremely difficult and time consuming. For now I'm forced to use LV v6.1 to make any software enhancements to my customer's code.

Any ideas why this happened? I sure would like to see some debugging tools made available for troubleshooting a LV executable. If I could tell which VI, or even which part of the LV code (e.g. IMAQ or VISA), was causing the crash it would be very useful.

Lewis Drake
Process Automation Corporation
Belle Mead, NJ
908 359-1011
www.processauto.com
Re: LabVIEW DSC 6.1 tag update slowdowns
At 11:04 PM -0400 5/7/04, David Hoadley wrote:
> Has anyone had trouble with slow updates of tag data in LabVIEW DSC v 6.1?
> Or better yet, have a resolution to the problem! We have a control system
> that consists of an Allen-Bradley PLC SLC5/05 connected via Ethernet to
> our HMI PC. Occasionally (once a week or so), the control system bogs down
> to the point where it may take 60-90 seconds for a control actuation on
> the PC to get a response from the HW in the field. There are a few hundred
> tags set to update 5 times per second. Most of the time, the system is
> very responsive. When we get into this state, rebooting the PC resolves
> the problem. I cannot recall the version of RS Linx at this time.

Turn off virus scanning on your Citadel data directory. It's a known issue, though it may not be what's causing your particular problem.

--
- Andrew Johnson
- WireWorks West
Re: I need to disable the right-click feature of my mouse while running
> I want to disable the right-click of my mouse while my Labview app is
> running. Anyone have a solution for this?

For each panel you want this done for, open VI Properties and change the ring to Window Appearance. Hit Customize, and in the lower right is an option for run-time shortcut menus.

Greg McKaskle
Re: Why is Time treated as an ugly child in labview
> What I mean to say is the use of time is inconsistent. Trying to put
> time on the x-axis of a chart is terrifying. XY chart is pretty easy
> and intuitive. Measurement Studio and VB is intuitive.
>
> The realtime.vi is a help but what happens when your data is coming in
> at irregular intervals. (40sec to 1hour++)

If you send waveforms to a chart, each waveform has a t0, and the waveform can have one point or 1,000,000. So to get a variable-time chart that isn't completely XY -- won't do circles -- just send a waveform with one point to the chart. This means every point is timestamped, and you probably will get what you want. I believe this is what happens by default when using single-point DAQ.

Greg McKaskle
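To make the single-point-waveform idea concrete: a LabVIEW waveform is essentially a cluster of a start time t0, a sample interval dt, and a Y array. Sending one-point waveforms means every update carries its own timestamp, which is exactly what irregularly spaced data needs. A rough Python analogy (a dict stands in for the waveform cluster; the names are illustrative, not LabVIEW API):

```python
from datetime import datetime, timedelta

def single_point_waveform(t0, value):
    """One-point waveform: carries its own start time t0, a dt (unused
    when there is only one sample), and a one-element Y array."""
    return {"t0": t0, "dt": 0.0, "Y": [value]}

def chart_points(waveforms):
    """Flatten a stream of one-point waveforms into (time, value) pairs,
    which is effectively what the chart ends up plotting."""
    return [(w["t0"], w["Y"][0]) for w in waveforms]
```

Because each update supplies t0 explicitly, a 40-second gap and a one-hour gap plot at their true positions on the time axis, with no assumption of a fixed sample rate.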
Re: Why can't I save any VIs?
> I can't save any VIs. When trying to save a VI, LabVIEW crashes and
> gives the following error message:
> Error #3: "memory.cpp", line 996, LabVIEW version 6.1
>
> This is a new installation of LabVIEW 6.1 on a Windows NT operating
> system.

What has happened prior to the save? Error 3 indicates that someone is calling a memory manager function with a bad value, possibly from another memory zone, but possibly corrupted.

Here is a test. Launch LV, open an empty VI, and save it somewhere. If this doesn't work, you have a bad installation of LV. I don't know how it got that way, but you might want to uninstall, check the file system using chkdsk or some other disk utility, repairing any bad tracks, etc. Then reinstall.

More likely, the empty VI is fine, and you have a particular VI that after being executed will not save. This probably means that VI or one of its subVIs is corrupting memory. The primary suspect in these cases is a DLL or CIN that is incorrectly configured or passed parameters it can't deal with. If you have more than one, narrow it down by putting case statements around some of them or running only partway through the VI and aborting. Then unload the VI to check for corruption. Post again if this is the case and it isn't obvious why the DLL call is corrupting memory.

Greg McKaskle
Re: How do I display a progress bar which obtains its value from
The reason I chose this way (other than not knowing another way) was that I wanted to increase the value in the progress bar for one case statement and decrease it for another. I have a while loop with a shift register that is used to increment the progress bar in one case, and in the other I just subtract 1 from the variable and then feed it back to the variable. It only increments or decrements for one iteration as it enters the case (when the button that sent me to the case is pressed). I am now trying to figure out how to make the iterations continuous while the button is held down. The 'button' controls a motor on a robot arm, and while it is held down, the motor drives. I want to represent the arm segment position with the progress bar (smoothly and continuously). I'll look into the reference value for the control. Thanks for the suggestion.

Greg McKaskle wrote:
>> I'm trying to do the same thing with a progress bar. The closest I
>> have come to increase/decrease control of the bar is to use a
>> variable. Right click on the variable name and select the desired
>> indicator. Feed the variable the data and the progress bar will
>> follow. The indicator does not need to be wired to anything and can
>> be placed anywhere.
>
> If the terminal isn't used, but a local of the same direction is, then
> you should delete the local and use the terminal. The terminal is the
> primary, built-in access to the data of the control. Use it over
> locals. Also try to use a minimum of locals, as more locals open up the
> possibility of more race conditions.
>
> To update a progress bar on another panel, use a control reference's
> value property.
>
> Greg McKaskle
Re: Type issues in LV
> How come NI did not address this issue yet?! They've got to have a
> universal "GetTypeofVariant" thing somewhere there, I am pretty sure.

In plenty of cases, variants can only have a few types. It is pretty easy to cast them to the various types to see which succeeds and which fails. In those cases where the variant can truly be anything, there is no simple way of saying "here is the type" except to give the type descriptor. The node that does that is Variant to Flattened String, right next to Variant to Data. Just ignore the flattened string part and take the array of I16s that encode LV types. As mentioned, there are utilities for parsing this type, but this is quite low level and difficult. On the other hand, it is a binary encoding of all possible types LV can represent; it has to be low level and complex. If you have more questions about getting something done, please ask. I'll stop making excuses for how hard it is now.

Greg McKaskle
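Greg's first suggestion, casting the variant to each expected type and seeing which succeeds, looks roughly like this in Python terms (Variant to Data with error checking plays the role of each cast; the candidate list and names are illustrative, and the order matters since the first successful cast wins):

```python
def sniff_variant(value):
    """Guess a variant's type by attempting a fixed list of casts in the
    order the application expects, returning the name of the first that
    succeeds. Mirrors trying Variant to Data against several types and
    checking the error out of each attempt."""
    attempts = [
        ("int", int),      # tried first, so int-like values report "int"
        ("float", float),
    ]
    for name, cast in attempts:
        try:
            cast(value)
            return name
        except (TypeError, ValueError):
            continue
    return "unknown"
```

This covers the common case Greg mentions, where the variant can only hold a handful of known types; the fully general case still requires parsing the flattened type descriptor.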
Re: MessageBox/MsgBox for LabVIEW
The message box is excellent! I really needed it. Thanks, Prasun Mital
Re: memory goes crazy when calling a sub-vi
> Well for Greg it is easy. He is one of the core LabVIEW developers and
> although he doesn't know every detail from his head, he can go and look
> in the sources, if he needs to.
>
> For me it is just 12 years of working with LabVIEW and having read a lot
> on Info-LabVIEW, as well as having done my own extensive low level C
> programming. Besides that, there has been a time when I had some spare
> time left to dig rather deep into LabVIEW and experiment with it on many
> levels to find out some of its internal workings.

Easy huh? I don't know about that. You will pick it up too. You learn from trying things, making mistakes, and asking questions. LV has a simple syntax so that you can quickly get started, but as with any good language, there are many different ways to do things. This means that you can suit the solution to the problem. Discovering when to apply which solution to what problem is quicker if you are an experienced programmer in any other language. If not, make lots of small controlled experiments and ask questions.

Greg McKaskle
Re: How do I display a progress bar which obtains its value from
> I'm trying to do the same thing with a progress bar. The closest I have
> come to increase/decrease control of the bar is to use a variable. Right
> click on the variable name and select the desired indicator. Feed the
> variable the data and the progress bar will follow. The indicator does
> not need to be wired to anything and can be placed anywhere.

If the terminal isn't used, but a local of the same direction is, then you should delete the local and use the terminal. The terminal is the primary, built-in access to the data of the control. Use it over locals. Also try to use a minimum of locals, as more locals open up the possibility of more race conditions.

To update a progress bar on another panel, use a control reference's value property.

Greg McKaskle
Re: I need to disable the right-click feature of my mouse while running my LV app
In the VI Properties (right click on the icon and select VI Properties...), in the Categories dropdown list, select Window Appearance. Choose Customize, then un-check the option "Allow run-time shortcut menus". That should do it for you.

Tim
Re: I need to disable the right-click feature of my mouse while running my LV app
See the answer to the identical post here: http://exchange.ni.com/servlet/ProcessRequest?RHIVEID=101&RPAGEID=135&HFORCEKWTID=99128:5&HOID=5065000800FED2&HExpertOnly=&UCATEGORY_0=_49_%24_6_&UCATEGORY_S=0
NO SCXI+DAQmx was: Software bug with NI-DAQmx 7.1, 7.2 & SCXI > 1125-SCXI 1313
Neville,

I just tried to do an SCXI strain gauge project with DAQmx and the most recent drivers and also had horrible experiences. I don't think the addressing of SCXI modules and channels is right. After my customer and NI didn't have any luck at it, I had to make an emergency plane trip to fix the system and ended up reverting to traditional DAQ. NI was supposed to be following up on this as of April 27th, but I haven't heard any reports on what they've found. Sorry for not posting this earlier, but... DON'T USE DAQmx with SCXI!

--David Moore

P.S. We also found that strain gauge excitation worked under DAQmx for Windows, but broke when we switched to DAQmx for LabVIEW RT. I was able to rewrite the app as non-RT, but obviously should have ditched DAQmx at that point.

> Subject: Software bug with NI-DAQmx 7.1, 7.2 & SCXI 1125-SCXI 1313
> From: "Neville Dholoo" <[EMAIL PROTECTED]>
> Date: Fri, 7 May 2004 07:09:59 -0700
>
> Hi Guys,
>
> Just spent 2 days isolating a very specific problem that I had when
> using the SCXI 1125 module with the SCXI 1313 breakout board.
>
> I set up an analog input virtual channel in old style DAQ (7.1 and the
> latest version 7.2) and the corresponding global channel in DAQmx.
>
> The old style daq channel worked fine, but the DAQmx channel just read
> garbage.
>
> Repeated support calls (yes, we signed up for the subscription!) didn't
> produce much help from NI.
>
> I finally narrowed it down to the specific channel range I was choosing
> (+/-15V). The old style daq allows you to specify the individual gain
> (on the specific SCXI 1125 channel) and attenuation (provided by the
> programmable breakout board). DAQmx does not. It automatically sets
> both these parameters based on signal input range. However, it looked
> like it set up the attenuation correctly, but the gain setting was
> probably wrong.
>
> Work-around: I had to use a daq channel on the PCI daq card instead,
> with a potential divider to bring the signal voltage down to +/-10V.
> Changing the code back to old-style daq was not an option, this late in
> the game.
>
> Hope this saves someone else some time.
>
> Cheers,
>
> Neville Dholoo
> Cellex Power Products, Inc.
> Richmond BC
> Canada