problem with LV 7.1 and LeCroy IVI drivers
Just a quick note to anyone who might be using LeCroy IVI drivers (at least for the WaveRunner 6100 oscilloscope) with LabVIEW. It appears that if you try to recompile them for LabVIEW 7.1, they won't work. It shows up as an "IVI Error Conversion" VI that won't run (broken arrow). Not only will it not run, but it won't give you any information as to why it isn't running. (the first time I've ever seen this) LeCroy has been made aware of this problem and is troubleshooting/fixing it. (In the meantime, I bought a Tektronix instead. Needless to say - no problems with it.) John
another Charlottesville job posting
Just thought I'd let you all know that Sperry Marine (a Northrop Grumman Company) is looking for Test Engineers here in Charlottesville, VA. There are several test engineering positions open. Most (but not all) require LabVIEW experience. One will involve mostly ATE-type work (functional testing on a VXI-based test station). Another will require some familiarity with industrial automation / machine control. Please email me if interested. Regards, John Howard, Test Engineer Sperry Marine, a Northrop Grumman Company 1070 Seminole Trail Charlottesville, VA 22901 (434) 974-2296
Development Suite update
I just got the update in the mail for the Developer Suite. I'm sure many of you will be pleasantly surprised to find out that ALL Professional versions of the various Developer Suites now include the following: LabVIEW State Diagram Toolkit 1.0, LabVIEW VI Analyzer Toolkit 1.0, LabVIEW Express VI Development Toolkit 1.0. This is great since these aren't products I'd be willing to pay $995 each for, but when they are thrown in for 'free', I might actually try them out. (Maybe even upgrade some of our Standard licenses to Professional to get these additional tools.) John H.
Re: How to talk my company into allowing OpenG usage?
One other detail to think about. Are you intending to use this code for internal use only? If so, then why should anyone care how you get your code? I work for a large defense contractor as well, and no one seems to mind as long as I get the job done and actually have all the source. (although perhaps mine is a case of "It's easier to ask forgiveness than get permission." ;) If you are using it for a deliverable product, then it will ultimately come down to the attitude of the people who make these decisions. Rather than looking for a way to accomplish something, they will often look for all the reasons it CAN'T be accomplished. Too often, the mentality at large defense contractors is to 'err on the side of caution' - usually to the point of paranoia. And heaven forbid you should ever want to do something different than the way it was done before! (I could go on) Anyway, good luck in your effort! John Howard Test Engineer "A Large Defense Contractor" >>> Jim Kring 04/14/04 08:11PM >>> > You get what you pay for; if it's cheap or free it's crap. And some opinions fit into that category too, but it is best not to generalize ;-) One should always perform an evaluation of the tools (and advice) they are going to use, prior to using them. If they meet your standards and requirements then go for it. If not, you can always throw money at your problems (and pay someone to invent you a wheel). Largely, the price you pay for Open Source software is the time that it takes you to evaluate, learn how to use, and participate in the development of the software -- in that respect Open Source software is not free. But if you don't pay that price, then... (sometimes) you get what you pay for. Regards, -Jim PS - Have you read your NI Software License Agreement lately? Is your application authorized and will it be authorized tomorrow? OK, gotta run...
I hear the helicopters circling :-)) Craig Graham wrote: >Swinarsky, DJ Derrick (5453) @ IS wrote: > > >>Just curious if anyone has any tips on convincing the corporate types >>that using software developed by the opensource community is a GOOD >>thing. I would love to be allowed to use all the OpenG tools and >>think that many in my group would also benefit from them. Also would >>like to use some of the labXML code developed on sourceforge. >> >>Problem is that I work for a big defense contractor that is not too >>keen about the idea of opensource. Have some paranoid individuals >>somewhere in the chain that think code developed on sourceforge is >>not reliable and shouldn't be trusted. Also have heard that in order >>for anyone in the company to use software downloaded from the >>internet the company has to have an agreement with the providing >>entity (company usually). >> >>Any help in this matter would be GREATLY appreciated!!! >> >> > > >I went down the same path about a year ago and gave up. Problems I >encountered (and still do in different contexts); > >1) You get what you pay for; if it's cheap or free it's crap. >2) If it wasn't written in-house, it's crap. >3) If the license doesn't state in one line of words of one syllable that >there are no copyright, royalty or any other IP issues with the code no >matter how you use it, it's crap. > >I disagree with all three points, but that's what I came up against and I've >given up on it. With regards to point 3, I got quite an involved email from >Jim Kring explaining what we could and could not do with OpenG. Didn't help- >the response was that an email isn't legally binding. In another instance, >Albert Geven offered to let me use some pretty useful code he'd done for >saving and loading panel settings in a way that was resistant to datatype >changes, control addition/removal etc. 
The conditions that were imposed on >using that were unreasonable- I was expected to ask Albert to sign a waiver >that no IP rights would be exercised on the code and fax it back to us so we >could use the code he'd given me as a favour. I didn't see why he should >have to mess about like that so I didn't do it. When someone sends you code >in an email with permission to do with it as you please, it's a bit off to >then start asking them to jump through hoops so they can help you out! > >Even though you have the source and so the included open-source stuff can be >audited (and maintained) just as well as the in-house stuff, I wouldn't hold >out hope on persuading the decision makers at your end. > ><\rant> > > >
labview job
I am posting this on behalf of a friend. There is a job available here in Charlottesville, Virginia, for a test engineering position at a company that makes pharmaceutical equipment. Basically they need someone NOW. The candidate would NEED test engineering experience as well as be a competent programmer (preferably LabVIEW). Contact me via private email for more details. John Howard [EMAIL PROTECTED]
Re: Standardized data
Sorry for the late response. I just discovered another option for standardized data in test applications. ATML is just XML with some standard formats for ATE applications. Some more work is needed, but you can take a look at what is available at http://www.atml.org/. (you will need to join to get access to the stylesheets etc, but it is free - no strings) I have never used ATML, so can't say whether it is really useful. Just thought some people may be interested in taking a look. John >>> Kilgour David 03/09/04 04:26AM >>> George, Try looking at some of the xml based formats that are described at www.xml.org There are schemas from a large number of different user groups. David __ David Kilgour Detection Department DSTL Fort Halstead Sevenoaks Kent TN14 7BP Tel: 01959 892687 Fax: 01959 892506 __ "This e-mail is intended for the recipient only. If you are not the intended recipient you must not use, disclose, distribute, copy, print, or rely upon this e-mail. If an addressing or transmission error has misdirected this e-mail, please notify the author by replying to this e-mail." "Recipients should note that all e-mail traffic on MOD systems is subject to monitoring and auditing."
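Since ATML is just XML, any XML-capable tool can read it. Here is a quick Python sketch of the general idea; the element and attribute names below are invented for illustration and are NOT taken from the actual ATML schemas (you'd need the real schemas from atml.org for those):

```python
import xml.etree.ElementTree as ET

# Hypothetical ATML-flavored test result. Element/attribute names here
# are illustrative assumptions, not the real ATML schema.
doc = """<TestResults>
  <TestGroup name="PowerSupply">
    <Test name="Vout" outcome="Passed">
      <Value units="V">5.02</Value>
    </Test>
    <Test name="Ripple" outcome="Failed">
      <Value units="mV">48</Value>
    </Test>
  </TestGroup>
</TestResults>"""

root = ET.fromstring(doc)
for test in root.iter("Test"):
    value = test.find("Value")
    print(test.get("name"), test.get("outcome"),
          value.text, value.get("units"))
```

The point being: once results are in a standard XML form, pass/fail rollups and report generation become generic tooling problems instead of per-tester code.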
Re: Confounded and Ranting
I wonder if the following would be useful. How about a tool using a tree view control, which would present a hierarchical overview of a LabVIEW application in the left pane of a window, and detailed VI/Library information in the right pane. One of the details presented could be where the VI is loaded from. I could even see something along these lines becoming the main user interface for a LabVIEW project management tool. There are probably a lot of features that could be included in such a tool that would be very handy to people working on multiple projects. Perhaps you could define project properties such as automatic prefixes/suffixes for all VI's in a project. Integration into a configuration control engine could be included. VI's actually loaded in memory could be highlighted somehow. I could keep going, but I think you get the idea. Would something along these lines be beneficial to a lot of developers or just a few crazy ones such as myself? :) (Needless to say, what I am picturing is not original. It kind of resembles the GUI for many text-based IDEs such as Visual Studio.) John Howard >>> George Gatling (Contractor) 03/01/04 11:05AM >>> Thanks for the great feedback here! It seems that at some point a save operation failed and caused LabVIEW to write to the temp folder. I rarely save things outside of the normal folders and never to something buried and hidden in Documents and Settings. Also the hard drive is not even close to full... some 10% full with 100GB remaining. But many other things could cause a file write to fail. Then I suspect that I got presented with a search dialog starting in the temp folder (although I did not notice this at the time). After I had chosen the first VI there, the garbage path was now in /* and so it likely started pulling sub-vis from both the main and the temp depending on which parent was trying to load the file into memory. This would explain the seemingly random pattern of temp vs main files.
The end of the story is it took about an hour to sort out, but I did eventually fix it. >I think this is a very important topic, and if you have further >questions or gripes, it is a good time to cover them. Might even have >something to do with what I'm working on. > I agree that this is an important topic, but the idea of further questions or gripes is mind-bendingly open-ended. I can say on a very high level that I am not satisfied with the way LabVIEW manages files and filenames. It is much harder to say what would make me happy. But I will let it percolate for a few days and see what revelations come to me :)
Re: Queues and strict typedefs
LV2-style globals will update with the typedef. John Howard >>> Craig Graham 02/25/04 07:09AM >>> When I have a strict typedef as the datatype of a queue, and I hold the reference to the queue in a global, changes to the typedef don't propagate to the global reference. Is there a way to make it happen? A different way of creating the reference for example? I'm presently using the context menu "Create Constant" and cut&pasting to the global. -- Dr. Craig Graham, Software Engineer Advanced Analysis and Integration Limited, UK. http://www.aail.co.uk/
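For anyone unfamiliar with the pattern: an LV2-style (functional) global is just a subVI whose uninitialized shift register holds the shared data, so every reader and writer goes through one typed interface and picks up typedef changes automatically. Since LabVIEW diagrams don't paste into email, here is a rough Python equivalent (names invented); the function attribute stands in for the shift register:

```python
# Rough textual sketch of an LV2-style global: one function, one piece
# of persistent state, shared by every caller. The attribute on the
# function plays the role of the uninitialized shift register.
def lv2_global(action, value=None):
    if action == "set":
        lv2_global.data = value  # "shift register" write
    return getattr(lv2_global, "data", None)  # "shift register" read

# One writer, many readers - all through the same interface.
lv2_global("set", {"queue_name": "my_queue", "depth": 10})
print(lv2_global("get"))
```

Because every access site calls the same subVI, changing the typedef of the data only requires updating that one VI, which is exactly why it tracks typedef edits where a dropped constant in a global does not.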
RE: Good Programming Practice?
I hope no one minds me bringing up LCOD again - it just seems like a really good solution here. :) LCOD (LabVIEW Component Oriented Design as explained in the book "A Software Engineering Approach to LabVIEW") provides a very elegant solution to this problem. Using LCOD one would encapsulate all user interface updates as messages in a component similar to a LV2-style global. This component can then be called ANYWHERE from ANY VI, without using ANY control references! You end up with a nice 'sterile' main VI, without the control reference overhead. John Howard >>> George Gatling (Contractor) 02/24/04 10:51AM >>> I am pretty surprised to hear so many people clamoring to keep the top level VI laden with code. In nearly all of the other languages (admittedly those crude text-based ones) the convention is to keep main() or its equivalent relatively codeless, except to start the necessary objects or loops. Let me be the first to say the analogy between the top level VI and main is not wholly accurate, but they are certainly similar. But from some simple benchmarks, it is apparent that moving indicator updates out of their "native" diagram is a serious performance hit. This, IMHO, is absurd and should not be. Now, performance aside, why is it so critical to have all of the panel updates contained in one potentially mammoth VI? It is no big deal if there are only a few indicators. But when you start really piling them on, managing them becomes a pretty big deal. Now it might not be such a hot idea to scatter front panel writes throughout the application, but I don't think it is a great idea to confine them only to their diagram either, especially when that diagram is the top level. The obvious, to me, compromise is to keep some sort of queue-driven UI update, but use references to move that into a sub-vi. Now, since it is by reference, the updates would not have to all be in this single sub-vi either. Instead, they could be scattered all over the UI update hierarchy.
If you are married to the idea of keeping every update in one diagram, our goals are intrinsically incompatible. I personally find such an approach impractical, and that is what started this idea in the first place. But I can see how the other end of the spectrum is fraught with confusion. If an update can happen anywhere, it will soon happen everywhere. So it would seem like the idea of globally available references to the top level indicators is probably NOT good programming practice, but I am still latched on to the idea of moving the updates into their own sub-hierarchy and out of the top level. This will incur the by-reference performance hit, but for now that is not a big deal for me as I only need about 0.5 updates/sec. And when the time comes that I need better performance I expect NI will have sped up the reference operations ;) Okay... some simple benchmarks. Numbers are updates/sec for an xy graph of 2000 points (random numbers):

1139 - no update, calculation only
942 - direct wire
132 - by value (directly linked property node)
128 - by reference, same panel
128 - by reference, sub-vi
125 - by value, signaling (directly linked property node)

George >Hmm, why is this a mess? It makes perfect sense to me. The only access >to the front panel via remote VI's is through a well defined interface. >You setup your messages and your data types and the update stays in the >main menu as it should. If a VI needs to update the front panel then it >posts a message on the queue with a specific update command. Queues have >been around for a while and it has been proven by many on this list to >be a very efficient method of communication between parallel processes. >I would re-think your comment on a queued architecture. > >Using control references breaks out of the object oriented methodology. >Manipulating VI panels directly via other modules means I need to know >everything about the Main VI for me to update the panel.
In many cases, >updating the front panel involves more than just dumping data to a >control. It may involve processing the data and possibly changing >several parameters of the Main VI. It is best to let the Main VI do what >it does and knows how to do. Using the control reference approach means >I now have my Main VI scattered in 100 different places. Where do I go >to fix a bug? Also, can two programmers work on the same code at the >same time? > >To comment on the use or overuse of references, I would have to say that >on the totem-pole of efficiency, they rank on the bottom of the list as >far as update speed. The order of efficiency would be (correct me if I'm >wrong): > >1-Wire directly to terminal (ah yes, the good ol' days) >2-Wire to a local variable >3-Wire to a "value" property node on the same diagram as the control. >4-Wire to a "value" property node via a control reference > >The reasoning behind what approach to take depends on the situation of >course. One thing to con
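Since block diagrams don't travel well over email, here is a rough Python sketch of the queued UI update both posts describe: subVIs anywhere in the hierarchy post (message, data) pairs, and a single consumer loop owns every indicator write. The message names and the dict standing in for the front panel are invented for illustration:

```python
from queue import Queue

ui_queue = Queue()
indicators = {}  # stands in for the front panel's indicators

def post_update(message, data=None):
    """Callable from any 'subVI': just enqueue a message."""
    ui_queue.put((message, data))

def ui_consumer():
    """The one loop that actually touches the 'front panel'."""
    while True:
        message, data = ui_queue.get()
        if message == "stop":
            break
        indicators[message] = data  # the only indicator write in the program

# Producers anywhere can post; no control references needed.
post_update("xy_graph", [(0, 1), (1, 4)])
post_update("status", "Running")
post_update("stop")
ui_consumer()
print(indicators)
```

The design win is the same one argued above: all panel updates funnel through one well-defined interface, so there is exactly one place to look when an indicator misbehaves, and the producers never need to know anything about the main VI.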
Re: Windows XP Time Set Feature
This XP time setting feature can be disabled. In the control panel, open "Administrative Tools", then "Services". Find and open the "Windows Time" service. Here you can stop it (temporarily) and disable it (permanently). Another option is to use the "Network Time Protocol" service as a replacement for the "Windows Time" service. It is a free download that requires some confusing setup to get working properly, but it seems to do a much better job at keeping all the networked computers accurately synchronized without making any big changes to the computer time. (it slowly shifts the time to keep synchronization rather than making periodic 'step' changes) It also seems far more accurate (should easily keep microsecond synchronization). Hope this is useful! John Howard, Test Engineer Sperry Marine, A Northrop Grumman Company 1070 Seminole Trail Charlottesville, VA 22901 (434) 974-2296 >>> Lewis Drake 02/19/04 08:51AM >>> Our SCADA Tool Kit has a feature that continually monitors the computer time clock to detect any gross changes. This is done because the data compression and history logging programs were written to work with monotonic timestamps. We started to see unexpected system time advance/retard changes causing problems in the SCADA Tool Kit on our XP based installations. The problem was caused by a new XP feature that periodically updates the system time clock automatically at scheduled times. If you open the Windows XP Time and Date Properties (right click on the computer's time display) you will see a new tab there called "Internet Time". You may adjust the time update schedule or disable updating there. I'm posting this to alert you to a potential problem if you are writing any LabVIEW program that assumes monotonic timestamps. Lewis Drake Process Automation Corporation Belle Mead, NJ 908 359-1011 www.processauto.com
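For what it's worth, the monotonic-timestamp assumption Lewis mentions is easy to demonstrate in any language: the wall clock can step backwards when a time service adjusts it, while a monotonic clock is guaranteed never to go backwards. A quick Python illustration of which clock to use for elapsed-time logic:

```python
import time

# time.time() reads the wall clock, which Windows Time (or any sync
# service) can step forwards OR backwards at any moment.
# time.monotonic() can never go backwards, so deltas are always safe.
t0 = time.monotonic()
time.sleep(0.01)
t1 = time.monotonic()

assert t1 >= t0  # guaranteed; the same check on time.time() is not
print(f"elapsed: {t1 - t0:.3f} s")
```

Any history-logging or data-compression code that subtracts consecutive wall-clock timestamps will eventually see a negative interval on a machine with automatic time updates; deriving intervals from a monotonic source avoids the whole class of bugs.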
Re: text selection - text color bug?
Right - of course. There I go spouting off again without thinking first. Like you say, an append method might make what we desire possible. I still think we should be able to accomplish something similar using tables and arrays though. John Howard >>> Michael Sachs 02/18/04 12:49AM >>> Problem solved. Thank you Jean-Pierre Drolet for your observation that I am rewriting the whole string each time, for some reason I did not see that even though it is painfully obvious. I think it would be nice to have an append method for the string control.
Re: text selection - text color bug?
It may not be considered a bug, but it is a royal pain in the you know what! It essentially makes the ability to add color highlighting to changing text too difficult to be worth the effort in most cases. (in my opinion) This brings up another problem I've had. Trying to color text in different cells of a table. Am I missing something, or is this basically impossible to do with LabVIEW? Back to the string indicator coloring problem again, I wonder how one would go about making a set of VI's that would simplify this type of detail? I'll have to think about this a little, there may be a way. Hmm... John Howard >>> Michael Sachs 02/17/04 09:18AM >>> Hello All, I am using the Text.SelStart, Text.SelEnd, Text.FontColor properties to color incoming rs232 strings as they are displayed in a text box. (I am snooping 2 ports and I need each port to have its own displayed color.) The problem is that when you add new text to the display it reverts all the displayed text to the starting color of the first text font. Is this a bug or some strange feature that I do not understand? Thanks
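Thinking out loud about that 'set of VI's' idea: one approach would be to keep a list of (start, end, color) segments alongside the string, so the Text.SelStart/SelEnd/FontColor writes can be re-applied to every segment after each append. A rough Python sketch of just the bookkeeping (the LabVIEW property writes themselves are omitted, and the port/color pairing is invented):

```python
# Track which character range of the display string belongs to which
# color, so the coloring can be replayed after every append.
segments = []  # list of (start, end, color) tuples
text = ""

def append_colored(chunk, color):
    """Append a chunk and remember the range it occupies."""
    global text
    start = len(text)
    text += chunk
    segments.append((start, len(text), color))
    # In LabVIEW this is where you'd loop over `segments` and write
    # Text.SelStart / Text.SelEnd / Text.FontColor for each one.

append_colored("COM1: hello\n", "red")
append_colored("COM2: world\n", "blue")
print(segments)
```

It doesn't fix the underlying annoyance (the whole string still gets recolored on update), but it at least centralizes the range arithmetic in one reusable place.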
RE: low level device drivers
Somehow I thought you might have some good advice, Rolf. :) Since there are a lot of functions I need to implement, I'll be looking for a simpler alternative. (Either using another company's hardware or getting one of our C++ programmers to write a DLL.) You always seem to have the answers whenever a DLL implementation question pops up. I'm sure I can speak for many other info-LVers in thanking you for your regular contributions! Cheers! John Howard >>> Rolf Kalbermatter 02/14/04 07:54AM >>> "Howard, John" <[EMAIL PROTECTED]> wrote: >I am considering using a PCI card (Sabtech NTDS interface card for those >interested), and it doesn't seem to have a nice simple DLL like I am used to. >Instead it has a kernel-mode Windows 2000 device driver, and I am expected to >use "...normal Windows I/O system calls..." to talk to it. The driver is >written to be part of the Serial Port driver class, BUT it can not be treated >simply as a serial port since "...there are several restrictions on the I/O >calls...", and "...it has its own set of DeviceIoControl functions." > >I know so little about Windows device drivers that I don't really even >understand the paragraph I just wrote. Does this mean I can't talk to the >card directly from LabVIEW? If I can, is it difficult for someone who uses >DLL's fairly often? Can I use the Win32 API to talk to it? Does anyone have >any suggestions on how to approach this? Well, you open a device driver by calling CreateFile() with the device driver name (usually something like \\.\MyDevice). Newer device drivers do not use a hardcoded device name but a class GUID which you have to translate into the appropriate device name through the setupapi functions before loading the driver. After that you basically call DeviceIOControl() and eventually ReadFile() and WriteFile() on the handle returned from CreateFile(). At the end you close the handle with the CloseHandle() function.
It is not impossible to call all these functions directly from LabVIEW with the Call Library Node, but especially the DeviceIOControl() gets nasty very fast. So it is possible, but unless you only have to implement one or two simple DeviceIOControl() calls it would probably be better to implement the interface in a DLL with a more LabVIEW-friendly call interface. Rolf Kalbermatter CIT Engineering Nederland BV tel: +31 (070) 415 9190 Treubstraat 7H fax: +31 (070) 415 9191 2288 EG Rijswijk http://www.citengineering.com Netherlands mailto:[EMAIL PROTECTED]
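As a small footnote to Rolf's description: the control codes fed to DeviceIoControl() are packed by the DDK's CTL_CODE macro, which is simple enough to reproduce in any language. A Python sketch of just that packing (FILE_DEVICE_SERIAL_PORT and the method/access constants are from the Windows DDK headers; the function code 20 is only an illustrative value, not necessarily one of the Sabtech card's custom IOCTLs):

```python
# Constants from the Windows DDK headers.
FILE_DEVICE_SERIAL_PORT = 0x1B
METHOD_BUFFERED = 0
FILE_ANY_ACCESS = 0

def ctl_code(device_type, function, method, access):
    """Mirror of the DDK CTL_CODE macro:
    (DeviceType << 16) | (Access << 14) | (Function << 2) | Method
    """
    return (device_type << 16) | (access << 14) | (function << 2) | method

# Pack an example serial-class control code (function 20, illustrative).
code = ctl_code(FILE_DEVICE_SERIAL_PORT, 20, METHOD_BUFFERED, FILE_ANY_ACCESS)
print(hex(code))  # -> 0x1b0050
```

Knowing how the code is packed is half the battle when wiring a Call Library Node to DeviceIoControl(); the other (nastier) half, as Rolf says, is marshalling the input and output buffers each IOCTL expects.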
low level device drivers
OK, I think I am in over my head. :) I am considering using a PCI card (Sabtech NTDS interface card for those interested), and it doesn't seem to have a nice simple DLL like I am used to. Instead it has a kernel-mode Windows 2000 device driver, and I am expected to use "...normal Windows I/O system calls..." to talk to it. The driver is written to be part of the Serial Port driver class, BUT it can not be treated simply as a serial port since "...there are several restrictions on the I/O calls...", and "...it has its own set of DeviceIoControl functions." I know so little about Windows device drivers that I don't really even understand the paragraph I just wrote. Does this mean I can't talk to the card directly from LabVIEW? If I can, is it difficult for someone who uses DLL's fairly often? Can I use the Win32 API to talk to it? Does anyone have any suggestions on how to approach this? There is another card I am considering from a different company that does have a DLL (with a rather obscure command set), which I could use instead of this one. Or maybe I could get one of our .NET programmers to write me something simple. Suggestions anyone? Thanks for any nuggets of wisdom! John Howard
RE: Dual monitors for labview use
Scott: - except for intense 3D games, the bus configuration (PCI vs AGP) shouldn't make a huge difference. - I think your performance will be more affected by which card you choose than by the bus configuration (personally, I do like Matrox). - unless you have nothing better to do with your PCI bus, I'd avoid 'wasting' PCI bandwidth on a video card. (there is "only" 133MB/s to play with). Just my thoughts. I'm certainly no expert. John Howard >>> Scott Serlin 02/13/04 11:34AM >>> What is the difference in performance I could see between running two separate pci cards, one pci card and one agp card, or one dual-monitor card? -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] Behalf Of Simon Whitaker Sent: Friday, February 13, 2004 2:59 AM To: [EMAIL PROTECTED] Subject: Re: Dual monitors for labview use Hi Scott, > Can anyone tell me what graphic cards work well with respect to using > dual monitors and labview? How does the graphic card handle moving the > display data from one monitor to another. I was considering the ATI > Radeon 9800XT. Great for gaming as well. I remember someone talking > about having the diagram window on one machine and the panel window on > another. No more fussing around with the windows to do your work > faster. We use dual monitors here, using a variety of different graphics cards. I have an nVidia GeForce FX5200 (great card, but take care if you buy one because they do single- and dual-monitor varieties), my boss has a Matrox card of some description, and one of our machines just uses two separate single-display cards. That machine runs XP, which has dual-monitor support built in, so you can use two separate single-display cards instead of a single dual-display card. In all cases, LabVIEW works just fine with them. 
The software we develop uses a subVI to retain the screen position of VIs and put them back in the same place next time they're launched (using the VI's Windows.Bounds property), and that has no problem coping with the dual display setup. As someone else has noted, XP and Windows 2000 treat dual monitors as one large desktop, so software doesn't need to be aware of which monitor it's running on, it just runs somewhere on that large desktop and neither knows nor cares whether that happens to be on monitor 1 or monitor 2. The only noticeable difference between the various solutions we use is the facilities offered by the driver software that comes with the video card(s). My nVidia card comes with an app called nView that allows you to do all sorts of fancy things with your dual monitors, such as enabling window spanning across desktops, choosing which monitor specific apps open on, setting different wallpapers for each monitor and so on. The machine that uses two separate cards and relies on the dual monitor support built into XP lacks these advanced features, although it's still perfectly usable. If you have an XP machine with a spare PCI slot, a spare PCI graphics card and a spare monitor, try plugging the card into the PC and giving it a go - won't cost you anything, and you'll see it first-hand. Try before you buy! :-) > One other off topic question. Have you guys/gals been buying > and using LCD monitors? Anyone running them for a long time? I keep > seeing used monitors show up on ebay that have a few/many lcds burnt out > or broken in the display. How hard and expensive are they to repair if > this is the case? I can solder as I'm an EE by nature. I've got an LCD monitor as my 2nd display - a fairly cheap 15" (1024x768) model that I have pivoted through 90 degrees (another feature supported by the nVidia software), making it great for working on documents. I've had it for about 4 months and haven't had any problems so far with dead pixels.
We've got another LCD display that we've had for about a year, again no problems with dead pixels. Not sure how feasible it is to repair a dead pixel. Note that even new LCD monitors may ship with some dead pixels, and most manufacturers specify a tolerance level for dead pixels below which they won't replace a panel. All the best, Simon Whitaker <[EMAIL PROTECTED]> Software developer, Tiab Ltd tel: +44 (0)1295 714046 fax: +44 (0)1295 712334 web: http://www.tiab.co.uk/
Re: Dual monitors for labview use
I seem to have more limitations imposed on me by the monitor than the video card. According to the G550 specs, it has "dual integrated RAMDACs", a 360 MHz primary and a 230 MHz secondary. It claims up to [EMAIL PROTECTED] on the main display, but my monitor can't go that fast. Likewise, the secondary should be able to go faster than 85Hz @1024x768, but once again the monitor can't. So maybe your own limitations aren't the video card? Just something to consider. John Howard >>> Uwe Frenz 02/13/04 10:37AM >>> John, you wrote: > I am using a Matrox G550, and it > works wonderfully! (21",[EMAIL PROTECTED] + 17", [EMAIL PROTECTED]) That's at or a little above the specs, AFAIK. The highres requires a pixel clock of >160 MHz. The video bandwidth of that chip is not far beyond 250 MHz, so the signal quality will not be very impressive. But it works - I use exactly the same resolution on a Nokia 445Xpro. But the second head used to have a much smaller video bandwidth. The producers tend to overstate their product capabilities, e.g. allowing resolutions that do not produce good signal quality - just to have impressive marketing labels. -- Mit freundlichen Grüßen! Greetings from Germany! -- Uwe Frenz ~ Dr. Uwe Frenz Entwicklung getemed Medizin- und Informationtechnik AG Oderstr. 59 D-14513 Teltow Tel. +49 3328 39 42 0 Fax +49 3328 39 42 99 [EMAIL PROTECTED] www.Getemed.de
Re: Dual monitors for labview use
>>> Uwe Frenz 02/13/04 02:49AM >>> ...but almost all dual monitor cards should work well too. ... This may be true in general, although I saw one dual monitor card do nasty things. The brand name was something like Apien or Apex or something like that. (does that sound familiar to anyone?) It was unbelievable how badly this card performed. Certain everyday tasks would only work on one monitor, and there were other problems I forget now. I am using a Matrox G550, and it works wonderfully! (21",[EMAIL PROTECTED] + 17", [EMAIL PROTECTED]) John Howard
Re: DAQ OPC on Windows XP
If any of you think you might use DAQ OPC on Windows XP computers, this email might save you a whole lot of headache. Those who have used the OPC server for NI DAQ have probably had to mess with DCOM security settings in order to get it to work. NI has some of this documented on their web site. However - there is one setting you have to change for Windows XP that they don't mention. Apparently, the default setup for XP forces remote users to authenticate as Guest. (there must be a logical reason Microsoft changed this - but it is beyond me) This means that DCOM clients (OPC) cannot connect to a server on an XP machine unless the Guest account is activated AND has sufficient rights to launch the server. Here is how you change this setting: Go to "Administrative Tools" > "Local Security Policy" > "Local Policies" > "Security Options" > "Network access: Sharing and security model for local accounts" You need to change this setting to "Classic - users authenticate as themselves". After doing this you won't need to activate the Guest account. It has taken me WEEKS to discover this! I am really ticked off that this wasn't documented somewhere for those of us who aren't experts with Windows XP networking. Now hopefully I can finally get down to writing some software! (with my luck - one of you will now find this documented in a user manual or somewhere silly like that :) Now to inform the NI support engineer who was just as confused as myself. John Howard
RE: State machine diagram editor
A few more state machine options to add: - use an array to manage the task list as Uwe mentioned - use a queue to do the same thing, but with a little different 'flavor' - a slightly 'odd ball' implementation I have experimented with is with user defined events (using the state machine's typedef enum as my data type). This allows control of the state machine from anywhere in your program - including the various parallel loops. Just in case we didn't have enough options already. :) John Howard >>> Uwe Frenz 01/19/04 03:36AM >>> Scott and all, I am a bit late here, but... "Scott Serlin" <[EMAIL PROTECTED]> wrote on Fri, 16 Jan 2004 11:17:53 -0600 ... > Also, does anyone have any state machine > tips that they can share? I typically use a string driven state > machine. I then call out each state from other states. I also use a > "nextstate" local variable so that I can reuse particular states in the > machine over and over again and not have to replicate the state > throughout the machine. Let me and the email list know your tips. ... My last development of a rather 'big' (e.g. complicated) state machine is some months ago. So my memories of the details are fading, but one point I remember being very important was that I used an 'array of steps/states' instead of just 'nextstate'. This was very important in cases where (in my case transmission) errors may damage data in a given state AND there's a way to recall that data within a given time frame (e.g. to recover from specific errors). Whenever the data receiving state detected an error it switched to the error recover state by putting that error state in front of the array of steps. And a finished step is removed from the array whenever it is really finished, e.g. afterwards. This way one could also 'script' such a state machine by calling it with more than one element in the step array. Just what I remember. At the moment I can not access that project code to recall more details. Greetings from Germany! -- Uwe Frenz ~ Dr.
Uwe Frenz Entwicklung getemed Medizin- und Informationtechnik AG Oderstr. 59 D-14513 Teltow Tel. +49 3328 39 42 0 Fax +49 3328 39 42 99 [EMAIL PROTECTED] www.Getemed.de
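Since LabVIEW diagrams can't be shown in a text post, here is a rough Python sketch of the array/queue-of-steps idea Uwe describes. The state names and the single injected error are hypothetical; the point is that pending states are consumed from the front of a deque, error recovery is scheduled by pushing the recovery state onto the front, and a step is removed only once it really finishes:

```python
from collections import deque
from enum import Enum, auto

class State(Enum):
    # hypothetical states for illustration
    INIT = auto()
    RECEIVE = auto()
    RECOVER = auto()
    DONE = auto()

def run(inject_error_once=True):
    """Queue-of-steps state machine: calling it with several elements
    pre-loaded in the deque is the 'scripting' Uwe mentions."""
    steps = deque([State.INIT, State.RECEIVE, State.DONE])
    trace = []
    error_pending = inject_error_once
    while steps:
        state = steps[0]          # peek; remove only when really finished
        trace.append(state)
        if state is State.RECEIVE and error_pending:
            error_pending = False
            steps.appendleft(State.RECOVER)   # recovery jumps the queue
            continue                          # RECEIVE stays for a retry
        steps.popleft()           # step finished, now remove it
    return trace

print(run())
```

With the error injected, the machine visits INIT, RECEIVE, RECOVER, then retries RECEIVE before reaching DONE; without it, the three scripted steps run straight through.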
Re: DAQ OPC on Windows XP?
Rolf, You were right about security for navigating the hive. I had to grant permissions at the HKEY_LOCAL_MACHINE\SYSTEM level before I could navigate further down. However, this did not fix my problems with OPC: I still get the same error message when trying to browse OPC servers on remote computers. My next step is to try getting this to work on a Windows 2000 computer to see if it makes any difference. Thanks! John

>>> Rolf Kalbermatter 01/15/04 03:43AM >>>
"John Howard" <[EMAIL PROTECTED]> wrote:
>If I try to view the registry keys on the remote computer using 'regedt32', I can not view anything below HKEY_LOCAL_MACHINE\SYSTEM, even though I have modified security for the following keys to make sure DCOM should work.
>"HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurePipeServers ..."
>I gave "Everyone" full access to these keys just to be sure security wouldn't be a problem.

I have no experience with either DAQ OPC or XP, but here it goes: Windows security is tricky at its best; maybe every security system is like that. However, I do believe that the inability to browse the hive in regedit may have to do with the fact that a key carries separate rights to view it and to enumerate (expand) it. So in order to browse to your key, you might need to grant the enumerate privilege to the user in question on every key from HKLM\SYSTEM down to the one you want to get at. There are actually a whole bunch of different privileges one can request when opening a key. Maybe XP has changed its RPC implementation so that it enumerates the key hierarchy level by level until it reaches the one it needs, which would fail whenever the enumerate privilege is not enabled. Or DAQ OPC or Windows RPC requests too many of those privileges when trying to open a key, failing on one of them even though it may not be needed for the operation in question.
As Microsoft has become more concerned with security in the past year or two, such small but (under certain circumstances) far-reaching modifications to core elements have become more the rule than the exception.

Rolf Kalbermatter, CIT Engineering Nederland BV
Treubstraat 7H, 2288 EG Rijswijk, Netherlands
tel: +31 (070) 415 9190  fax: +31 (070) 415 9191
http://www.citengineering.com  mailto:[EMAIL PROTECTED]
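Rolf's point about per-key privileges can be made concrete. The bit values below are the documented Windows registry access-right constants (from winnt.h); the `can_browse` helper is only a hypothetical illustration of why a missing enumerate bit blocks expanding a hive level even when the key is otherwise readable:

```python
# Windows registry access-right bits, as documented in winnt.h / MSDN
KEY_QUERY_VALUE        = 0x0001
KEY_SET_VALUE          = 0x0002
KEY_CREATE_SUB_KEY     = 0x0004
KEY_ENUMERATE_SUB_KEYS = 0x0008   # needed to expand a key in regedit
KEY_NOTIFY             = 0x0010
STANDARD_RIGHTS_READ   = 0x20000

# KEY_READ is the composite mask most tools request when browsing:
KEY_READ = (STANDARD_RIGHTS_READ | KEY_QUERY_VALUE
            | KEY_ENUMERATE_SUB_KEYS | KEY_NOTIFY)

def can_browse(granted_mask):
    """Hypothetical check: a hive level can be expanded only if the
    granted access mask includes the enumerate bit."""
    return bool(granted_mask & KEY_ENUMERATE_SUB_KEYS)

print(hex(KEY_READ))  # 0x20019
```

So a caller granted only KEY_QUERY_VALUE can read values on a key it already holds open, yet still cannot enumerate subkeys below it, which matches the "can't view anything below HKLM\SYSTEM" symptom.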
Re: Wish list
Here are a few ideas I have thought about...
- When VIs are moved to different directories or renamed within the IDE, their links to other VIs would automatically be maintained.
- Project-level documentation could be created and maintained within the IDE. While browsing VIs and other files in the IDE, some of this documentation could be displayed.
- Configuration files such as .ini or database files could be integrated into the project via the IDE. Then LabVIEW programming using these settings and other variables could be streamlined and simplified. (Integration with MSDE, other databases, and XML could be incredibly useful!)

John Howard
RE: State machine diagram editor
One tip that I would offer is to consider using a Strict Type Def Enum to drive your state machine rather than a string. Part of the reason for this is to prevent accidental coding errors such as mistyping a string or forgetting to implement a case. Also, if you further develop your state machine into a component which uses an LV2-style global to locally contain component information (based on LabVIEW Component Oriented Design (LCOD) - see the book "A Software Engineering Approach to LabVIEW" for details), the Enum turns into a very simple way of accessing the component's functions. (Perhaps I should have broken up that 'sentence' a little.)

One minor drawback is having to save the Strict Type Def Enum in a separate .ctl file. However, I have also taken advantage of this when implementing multiple components which share the same states/functions. Anyway - just my thoughts. John Howard

>>> Scott Serlin 01/16/04 12:17PM >>>
Thanks for the info. I was hoping for a faster way to add and modify states in the state machine. Also, does anyone have any state machine tips that they can share? I typically use a string driven state machine. I then call out each state from other states. I also use a "nextstate" local variable so that I can reuse particular states in the machine over and over again and not have to replicate the state throughout the machine. Let me and the email list know your tips. Thanks. Scott

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of [EMAIL PROTECTED]
Sent: Friday, January 16, 2004 7:48 AM
To: [EMAIL PROTECTED]
Cc: Info LabVIEW (E-mail); 'John'
Subject: Re: State machine diagram editor

Rolf K.
wrote:
> I have looked at the State Machine Toolkit and believe that it is fine for the standard state machines a lot of users usually encounter, but my state machines usually tend to be just a tiny little bit more involved, so that I would have to hand-edit the generated state machine afterwards anyhow, and once modified manually you can't seem to go back to continue with the State Diagram Editor.

The inability to go back to the editor after you manually edit is true of most computer-aided wizards. The fundamental problem is that the wizard knows how to maintain a state machine under certain conditions and how to modify various pieces when you change something in the editor. Once you introduce a non-standard component, the wizard has no idea how to handle that component as the system changes. The once-you-manually-edit-you-can't-go-back-to-automatic problem covers the State Machine Toolkit, Express VIs, and a lot of tools built by just about every piece of helpful software ever written on this planet. It requires either a sophisticated AI that can recognize all the components a user might introduce into a system, or a very restricted set of things the user can introduce. Notice how HTML editors handle non-standard tags.

Stephen R. Mercer -= LabVIEW R&D =-
"I do not believe that Hell is a physical place. I believe that Hell is an hour of the morning." -- Jan 16, 2004
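John's earlier tip in this thread, driving the state machine with a Strict Type Def Enum instead of strings, has a rough text-language analog. This Python sketch (with hypothetical state names) shows the safety argument: every enum member must be spelled correctly for the code to run at all, whereas a misspelled string key would only fail when that state is finally hit:

```python
from enum import Enum, auto

class State(Enum):
    # hypothetical states for illustration
    IDLE = auto()
    MEASURE = auto()
    SHUTDOWN = auto()

def dispatch(state: State) -> str:
    # one 'case' per enum value, like a Case Structure wired to the enum
    handlers = {
        State.IDLE: lambda: "waiting",
        State.MEASURE: lambda: "acquiring",
        State.SHUTDOWN: lambda: "cleaning up",
    }
    return handlers[state]()

print(dispatch(State.MEASURE))
```

A typo such as `State.MESURE` raises an AttributeError the first time the line executes, while `"mesure"` in a string-driven machine silently falls into the default case, which is exactly the class of bug the Strict Type Def Enum prevents at edit time in LabVIEW.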
Re: State machine diagram editor
It lets you create state-machine-based LabVIEW programs by 'simply' drawing a state diagram. You can switch between the diagram view and the code view. It is probably most useful for documentation, where state diagrams are needed. It is not part of any version of LabVIEW; you have to pay for it separately. The price is $995 (the standard price for just about any LabVIEW toolkit from NI). I am selfishly hoping that no one buys it, so that NI may change their mind and include it in LabVIEW Professional or something like that. (For $995 I'll keep using Visio to draw the few state diagrams I need.) John H.

>>> Scott Serlin 01/15/04 05:53PM >>>
Can anyone tell me what the state machine diagram editor toolkit does and which version of LabVIEW it is part of? Thanks in advance. Scott