Hello,
You may want to post your VIs on this discussion forum and someone
will probably be glad to convert them.
You can convert LabVIEW 7 VIs to version 6.1:
From the
File menu, select
Save with Options, then select
Save For Previous.
However, some new features of version 7.0 will not convert to 6.1.
Hello,
I just used the application builder of version 7.1 and created an
application. I included my application.ini file as a support file
during the build process.
Then I installed the application.
The .ini file was not replaced by LabVIEW.
It is still the same behavior as in previous versions of LabVIEW.
In principle, yes.
However, how true that is depends on how the LabVIEW R&D developers
are implementing it, and things have been changing between LabVIEW
versions, to different degrees, all the time.
If the subVI does not contain data that depend on a previous
run, such as local variable or shift register data, then there is no
need to make the VI reentrant.
Hello Paul,
I maximized the VI, and yet it did not duplicate what you are seeing.
However, NI engineers are aware of a different bug (that I reported)
in version 7.0/7.1 that has to do with a window being maximized.
I was not able to reproduce it on my Windows 2000 / LabVIEW 7.1 system.
Is this happening with this particular VI, or with any other VI?
I have installed LabVIEW version 7.1, and I do see the DAQ VIs in the
vi.llb. You have to install the drivers from the
Device Driver Reference CD.
If this does not work, then it has to be something else
(like in version 7.0).
By the way, I do not have any DAQ hardware on my development computer.
Your example suggests you are using the Report Generation Toolkit.
Then you have to add
_Word Dynamic VIs.vi from _wordrdsub.llb and
_Excel Dynamic VIs.vi from _exclsub.llb
as dynamic VIs on the Source Files tab when you build the application.
You find these in the LabVIEW directory\vi.lib\addons\_o
Good job, Dennis.
Keep the good work going. It is always beneficial for the rest of us
to get ideas from the experts.
I have been saving my data to text-based files, of course keeping
track of how the data are formatted, through flags. And I usually create
extra fields for future expansion. It is tedious, but the file can be
read by any spreadsheet and/or text-editing software.
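As an illustration of that approach, here is a minimal sketch in Python
(the field names, the FMT flag value, and the file name are all made up):
a format flag in the header tells any reader how the columns are laid out,
and two spare columns are reserved for future expansion.

    # Write a tab-separated text data file with a format flag
    # in the header and two columns reserved for future expansion
    header = "FMT=2\ttime\tvoltage\tcurrent\tspare1\tspare2\n"
    rows = [(0.0, 1.25, 0.031), (0.1, 1.27, 0.033)]

    with open("run001.txt", "w") as f:
        f.write(header)
        for t, v, i in rows:
            # the two trailing zero fields are placeholders for later use
            f.write(f"{t}\t{v}\t{i}\t0\t0\n")

A file like this opens directly in any spreadsheet or text editor.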
I use datalog files to handle data t
It could be that the order of the components of the clusters in the
calling and called VIs is different.
In the called subVI, right-click on the cluster, select
Reorder Controls in Cluster, and note the order.
Then, when you wire the bundled data cluster to the subVI, follow the
same order, or else some elements will end up in the wrong places.
I see that; it looks like a bug to me in version 7.
One way I have handled that is to normalize my several Y-axis data and
plot them in percent (scale 0 to 100).
So if one column of data is voltage, another is current, temperature,
etc., I divide each column by its predetermined or assumed maximum,
and this becomes the hundred-percent point.
Then I use
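The arithmetic is simple enough to sketch in a few lines of Python (in G
it is just an array divide and multiply; the column data and maxima here
are made-up examples):

    import numpy as np

    voltage = np.array([1.2, 3.4, 4.8])     # example column data
    current = np.array([0.02, 0.05, 0.09])
    v_max, i_max = 5.0, 0.1                 # predetermined or assumed maxima

    voltage_pct = 100.0 * voltage / v_max   # each column now spans 0-100 %
    current_pct = 100.0 * current / i_max

Both normalized columns can then share a single 0-100 % Y axis.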
My understanding is that you are generating a sine wave (on one of the
analog output channels) and you are trying to read the signal back.
Since you can verify the signal at low frequency but not at high
frequency, it looks like you should increase the scan rate of the data
acquisition (if you can, set it to at least twice the signal frequency;
in practice about ten times the frequency gives a clean-looking trace).
"In other words, does the reentrancy get passed up the chain?"
My understanding of reenetrancy says it should.
I never used it like that before, but I created little program that
duplicates the scenario you put in the post, and it worked correctly,
as I expected. I passed data thru terminals to g
It depends on how you initialize the algorithm and on whether you are
reversing the sequence or not.
Refer to the following discussion thread; it has VIs that are purely G,
no DLLs:
http://exchange.ni.com/servlet/ProcessRequest?RHIVEID=101&RPAGEID=135&HOID=506500080021AF&UCATEGORY_0=_49_%
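To make the point about initialization and reversal concrete, here is a
minimal CRC-16/CCITT sketch in Python (polynomial 0x1021). The result
changes with the initial value and with whether each byte is bit-reversed,
which is exactly why two otherwise correct implementations can disagree:

    def crc16_ccitt(data: bytes, init: int = 0xFFFF, reflect: bool = False) -> int:
        # CRC-16/CCITT, polynomial 0x1021. Common init values are
        # 0xFFFF, 0x1D0F, and 0x0000; all three are in real-world use.
        crc = init
        for byte in data:
            if reflect:
                byte = int(f"{byte:08b}"[::-1], 2)  # reverse the bit order
            crc ^= byte << 8
            for _ in range(8):
                crc = ((crc << 1) ^ 0x1021) if crc & 0x8000 else (crc << 1)
                crc &= 0xFFFF
        return crc

    print(hex(crc16_ccitt(b"123456789")))  # 0x29b1 for init=0xFFFF, no reflection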
From the Edit menu, select
Set Tabbing Order.
Changing the tabbing order will also change the order in which the
variables are listed when you want to list or select a local variable.
I assume you are trying to find a Gaussian distribution function to fit
to the data. All you need is to find the average and the sample
standard deviation of your data and then build the Gaussian function.
You can plot that; you can also plot a histogram of your data to
compare to the Gaussian plot.
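A quick sketch of that procedure in Python (the data array is just a
stand-in for your measurements):

    import numpy as np
    import matplotlib.pyplot as plt

    data = np.random.normal(5.0, 2.0, 1000)  # stand-in for your measurements
    mu = data.mean()
    sigma = data.std(ddof=1)                  # sample standard deviation

    # Build the Gaussian from the computed mean and std deviation
    x = np.linspace(data.min(), data.max(), 200)
    gauss = np.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

    plt.hist(data, bins=30, density=True, label="data histogram")
    plt.plot(x, gauss, label="Gaussian from mean and std dev")
    plt.legend()
    plt.show()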
Are you using serial VISA?
If so, you have to install the VISA Run-Time Engine, or, when you
build the executable (Application Builder), enable the selection for
serial port support.
If by brightness you mean turning the LEDs on/off, then you can
use the DIO lines of the PCI-6024E to turn on/off (or connect/disconnect)
the power supply to an LED.
The card has 8 DIO lines, and you have to configure them for digital
output.
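In LabVIEW you would do this with the digital output VIs; purely as an
illustration, the same on/off switching looks like this with NI's modern
Python driver, nidaqmx (the device name Dev1 and the line number are
assumptions for your setup):

    import time
    import nidaqmx

    with nidaqmx.Task() as task:
        task.do_channels.add_do_chan("Dev1/port0/line0")  # one DIO line
        task.write(True)    # drive the line high: LED on
        time.sleep(1.0)
        task.write(False)   # drive the line low: LED off

Note this only switches the LED; true brightness control would need PWM
or an analog output.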
Right-click on the subVI,
select SubVI Node Setup,
then
deselect the item
Open Front Panel when loaded.
It should run as you expect.
It seems that the gain setting is not right.
For the channels that are 0-100 mV, set the limits to min = 0, max = 100 mV,
and
for the 0 to 10 V channels, set them to min = 0, max = 10 V.
There can be issues (with settling time) at high scan rates when the
gains are set differently, but for the most part it should be fine.
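The same per-channel limits, sketched with the modern nidaqmx Python
driver just to show the idea (the device and channel names are
assumptions; in traditional NI-DAQ you would set the input limits per
channel in the AI Config VIs):

    import nidaqmx

    with nidaqmx.Task() as task:
        # 0-100 mV channel: tight limits let the driver pick a high gain
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0", min_val=0.0, max_val=0.1)
        # 0-10 V channel: wide limits, low gain
        task.ai_channels.add_ai_voltage_chan("Dev1/ai1", min_val=0.0, max_val=10.0)
        task.timing.cfg_samp_clk_timing(1000, samps_per_chan=100)
        data = task.read(number_of_samples_per_channel=100)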
To eliminate the possibility of errors, you need to convert the
application to a version fully compatible with Windows XP, like 6.1 or
7.0 (6.02 should do it).
Would version 5.1 work with XP? NI does not guarantee that. However, I
have installed applications written in version 5.1.1 and they worked fine
under XP.
Once you have checked that the LabVIEW fonts are set to "plain"
(Tools >> Options >> Fonts >> Font Style),
the issue should be solved.
If not, go to Windows settings >> Appearance.
Then select the item (icon, menu, menu bar, etc.).
I am not sure which is the one to check,
but, for each item selected, check its font setting.
It has been like that since the days of LabVIEW version 3.1 (when I
started using LabVIEW).
This is also true for controls.
I have had this note hanging in my cubicle since late 1996:
The controls that are not connected use their default values.
The outputs (indicators) that are not connected use their
Some fonts are set depending on how they are set in the Windows settings
(assuming a Windows OS); check those settings.
Also, in the LabVIEW menu, select
Tools >>
Options, and then make sure the fonts are not set to bold. You may
have pressed some keyboard key that toggled that.
An indicator that is connected to a terminal (if not updated) will
pass its default value.
But why you have to run the VI twice to make the indicator restore
its default value, I do not know.
This is also an issue in version 6.02 that I had seen before (but
never been worried about).
If I am
Congratulations, Ben.
Your comments and suggestions are always helpful.
Imad
It does not appear to me that this is a conversion error. It has more
to do with what the VI is trying to do than anything else.
The max count (I32 representation) is 2,147,483,647,
and you are reporting an error or a value of 2,147,352,565.
It may be related to the max count or to some data inside the file.
The default is double (float64),
but you can create an array, then
right-click on the array icon and select
Representation ---> Single (float32).
Use the Key Focus property to set the focus on the input string.
The input string will then be automatically selected.
Setting the VI's execution system to "user interface" may also help:
right-click on the VI icon,
select VI Properties, then
Execution >> Preferred Execution System.
Select the existing property node / local variable.
Drag the selected items with the mouse while holding the Control
key.
Then release the mouse. A new copy is created.
Just to report further on the issue and how I was able to solve the
problem.
After experimenting with many VI settings, I eventually found that by
pressing the maximize/restore icon box (top right border of the VI)
and then saving the VI, the problem goes away (in this case it is
debug_cntr-mnu.vi).
Today I ran the VI on a Windows XP, Pentium 4, LabVIEW 7 system,
and it showed the same behavior.
Now, how did all this happen?
I converted an application from LabVIEW 6.02 to 7, and started seeing
that behavior. Similar applications in LabVIEW 5.1, 5, 4.1, 4, and 3.1
worked correctly. I also conve
If by the labels of the enum you mean the strings, then you cannot do
that for an enum. This is how an enum is designed.
You can use a ring and change / set the strings using the Strings
property. For an enum type, this property is read-only.
However, you can create an enum type definition and change its
I assume you are using serial VISA.
The normal sequence of RS-232 communications that I use is:
1- Flush the communication buffers, both read and write.
2- Write the string.
3- Wait for some ms (depending on the instrument used).
4- Read the bytes.
I have never had problems with that.
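For reference, the same flush/write/wait/read sequence sketched with
PyVISA (the resource name ASRL1::INSTR, the baud rate, and the *IDN?
command are assumptions for illustration):

    import time
    import pyvisa

    rm = pyvisa.ResourceManager()
    inst = rm.open_resource("ASRL1::INSTR")  # COM1; adjust to your port
    inst.baud_rate = 9600

    inst.clear()          # 1- flush the read and write buffers
    inst.write("*IDN?")   # 2- write the command string
    time.sleep(0.1)       # 3- instrument-dependent wait
    print(inst.read())    # 4- read the reply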
The only catch to this method is this:
if, for instance, the normal desired setting for a control is ENABLED,
and during the VI run it is set to DISABLED, and the VI is saved
(before the method is called), the default setting will be DISABLED,
assuming that the VI is then closed.
When the VI is later loaded
Refer to the following discussion thread
http://exchange.ni.com/servlet/ProcessRequest?RHIVEID=101&RPAGEID=137&HOID=506500050096110100&HTHREAD=70038&UCATEGORY_0=_49_%24_6_&UCATEGORY_S=0
In general, the objective is to find the optimum signal that would
make the error minimal.
Start wi
If I understand you correctly, you are reading the two current
positions ((1) up-down and (2) left-right), and you have two set
points (one for the up-down and the other for the left-right position),
read by the master. And the objective is to make the readouts for the
positions as close as possible to the set points.
The Engine Momentum subVI should finish execution with each iteration
of the loop in DAQ.vi.
Therefore, connect a Boolean constant set to True to the conditional
terminal of the While Loop of the Engine Momentum subVI (instead of
the stop value from the main VI).
The real issue here is that if you convert existing code written in
LabVIEW 6/6.1, it should work the same way, without your having to worry
about the little things that make you examine and rewrite some of the code.
I agree with you that the serial VIs should not have been dropped (I
assume NI has their own good reasons for doing that, though).
I have the same issue that you have, and I have not upgraded to
LabVIEW 7 yet; I am waiting for release 7.01 or 7.02 to fix all the
little things I have been reading about.
http://exchange.ni.com/servlet/ProcessRequest?RHIVEID=101&RPAGEID=135&HOID=506500080021AF&USEARCHCONTEXT_CATEGORY_0=_49_%24_6_&USEARCHCONTEXT_CATEGORY_S=0&UCATEGORY_0=_49_%24_6_&UCATEGORY_S=0
There has been a recent discussion on the implementation of a CCITT
algorithm; refer to the above link.