Re: Frequency spectrum of IFFT output

2004-02-14 Thread LocalDSP
First a small clarification: IFFT stands for Inverse FFT, the operation
that converts a complex frequency-domain spectrum back to a time-domain
signal. When you write that the output is a "summation of N orthogonal
sinusoids", it sounds like a time-domain signal, which is the result of
an IFFT. However, you also write that the output "represents complex
weights for the corresponding sinusoidal basis functions", which sounds
more like a frequency-domain representation, i.e. the result of an FFT.
Can you explain in more detail what you are trying to achieve?

A spectral component represented by a complex number (a+ib) can be
converted to the two fundamental values needed to represent a
sinusoidal (sine) signal in the time domain. These are the amplitude
of the sine tone (r) and its start phase (Theta). The conversion
to/from "Complex" from/to "Polar" can be done in LabVIEW using the two
primitives "Polar to Complex" and "Complex to Polar". These are
located in the Numeric>>Complex palette.

The output of an FFT is originally a complex array, but it is usual to
convert the result into two real arrays, one representing the
"Magnitude" (r) and one representing the "Phase" (Theta or Phi). Then
you can graph the Magnitude and the Phase results on two different
graphs.
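
If it helps to see the arithmetic behind those two primitives, here is a
minimal C sketch of the same conversion (the value 3 + 4i is just a
made-up example bin):

#include <stdio.h>
#include <complex.h>

int main(void)
{
    /* One spectral component a+ib; 3 + 4i is just an illustrative value. */
    double complex bin = 3.0 + 4.0 * I;

    double r     = cabs(bin);  /* magnitude, the r output of "Complex to Polar" */
    double theta = carg(bin);  /* start phase in radians, the Theta output      */

    /* Going the other way, as "Polar to Complex" would: r * e^(i*Theta). */
    double complex rebuilt = r * cexp(I * theta);

    printf("r = %g, theta = %g rad\n", r, theta);
    printf("rebuilt = %g + %gi\n", creal(rebuilt), cimag(rebuilt));
    return 0;
}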



Re: I am looking for a indicator like a FLASH LIGH, just show my

2004-02-14 Thread Greg McKaskle
> Oh I agree with that. Like I said, this is just the quick and dirty
> way of doing it. You have to lock up the computer to stop the
> animation (something which we've managed to do a couple of times). A
> real progress indicator is much better. Instead of just saying "Please
> Wait" on screen you can also add an animated GIF.


There are two types of progress bars.  I'll call them qualitative and 
quantitative.  The qualitative bar only indicates that progress is being 
made, but can't predict when it will complete.  This is appropriate when 
waiting on a computer or other resource.

The quantitative progress bar shows a percentage or an n-of-m display so 
that it is possible to know roughly when it will complete.  And please 
don't use multiple progress bars showing individual steps; think of the old 
PC installers with their three progress bars, each bouncing around, and 
still giving you no idea of when the installer would complete.

To do the first, use a gif, a ring, a message, etc.  To do the second, 
use a slider, meter, progress bar, etc.

Greg McKaskle




Re: If you take a ''Meter'' indicator (the one with the rainbow...

2004-02-14 Thread Dargar
I'm so sorry Ben! I didn't notice what had happened to my question's
subject field.

The question in my subject is not mine at all. When I searched the
forum for an answer to my problem (changing the color), I read the
replies in the thread whose subject I ended up reusing, and when I
didn't find the answer, I posted my question but forgot to change the
subject field!

I'm sorry for the confusion!! I wish I could edit it, but this forum
doesn't support editing, as far as I know.


Thanks for showing me how to change the colors! The property node is
huge; even with your example to guide me, it took a little while to
find it in all the sub-menus. :)

Thanks, and sorry for posting with the wrong subject. If there are any
moderators here that can change it, please do so!!



RE: RS232 Displays for LVRT

2004-02-14 Thread Jim Kring
Sergey,

Here are some pretty high end ones:

http://www.xantech.com/splcd/

If you are looking for something with fewer bells and whistles you can use a
dot matrix LCD display with an RS232 interface.  Check out  Matrix Orbital:

http://www.matrixorbital.com/

-Jim

> -Original Message-
> From: [EMAIL PROTECTED] 
> [mailto:[EMAIL PROTECTED] On Behalf Of Sergey Liberman
> Sent: Friday, February 13, 2004 9:46 PM
> To: Info-LabVIEW
> Subject: RS232 Displays for LVRT
> 
> 
> Hi all,
> 
> Does anyone know of any display (preferably touchscreen, but not
> necessarily) that can be controlled over RS232 interface 
> (rather than a video board or chipset)? I need to connect a 
> local display to a system running LabVIEW RT, and it looks 
> like RS232 is the only option that I have. I don't need a VGA 
> resolution - smaller resolution, black-and-white LCD will be 
> fine. I know there was at least one company that had products 
> like that - it had a booth at NIWeek a couple years ago - but 
> I can't recall its name. TIA,
> 
>Sergey Liberman   [EMAIL PROTECTED]
>Solidus Integration   http://solidusintegration.com
> 
> 
> 




Re: Acquire Analog and Digital inputs at the same time with 6025E card

2004-02-14 Thread Ben
The user manual for that device

http://www.ni.com/pdf/manuals/322072b.pdf

indicates that it uses programmed I/O to interact with the DIO lines.
This could be a major complication!

Are you trying to acquire timing info from the digital lines, and if so,
what kind of precision do you need in your timing?

Ben



Re: Can we acquire the Analog input and Digital Input at

2004-02-14 Thread chenchen
Thanks Ben. I use a 6025E. All the analog channels are
used, and there are several other digital signals. I have posted a new
question here.



Re: i created boolean references in my main vi block diagram and...

2004-02-14 Thread Ben
Expanding and clarifying what BJD said;

After you create the temporary sub-VI that BJD mentioned, open its
front panel and copy the reference control that LV created when it
created the sub-VI.

This reference control will be of the correct class, etc., that you need.
Use the control to replace the original control that you were attempting
to wire up.

The technique of "create sub-VI...copy" always works for me.

There is one more thing that you should watch out for:
the mechanical action of the boolean cannot be set to a latch action
when you intend to read the value using a Value property node.

Trying to help,

Ben



Re: Problem with array processing !!!

2004-02-14 Thread Ben
Expanding on the correct answers given by Wiebe and Mads;

I suggest you conduct the following experiment.

Put a case structure around the "output array" indicator. Drop a "=0"
node from the Comparison palette and wire its input from the Remainder
output of your Quotient & Remainder function. The output of the "=0"
drives your new case selector terminal.

Change your array size to 1,000,000 and watch your loop count.

The modifications outlined above will update your array indicator
only "once each time through" your buffer. It will be quite obvious when
the indicator is being updated and when it is not.

Array indicator updates are always expensive.
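
For what it's worth, the pattern is the same in any language. A rough C
sketch of the idea (the array size and update period are arbitrary
numbers for illustration, not anything taken from your VI):

#include <stdio.h>

#define ARRAY_SIZE     1000000
#define UPDATE_PERIOD  10000    /* arbitrary: refresh the display once per 10000 points */

int main(void)
{
    for (long i = 0; i < ARRAY_SIZE; i++) {
        /* ...per-element work on the buffer would happen here... */

        /* Quotient & Remainder wired into "=0" driving the case structure:
           only the remainder-equals-zero case touches the expensive indicator. */
        if (i % UPDATE_PERIOD == 0)
            printf("updating indicator at element %ld of %d\n", i, ARRAY_SIZE);
    }
    return 0;
}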

Ben



Re: Instrument I/O assistant doesn't show terminals

2004-02-14 Thread Offramp68
Yes Tommy, I tried all of the following:

1. Repeated the same procedure on a different PC (a notebook with a
fresh installation, WinXP)
2. Opened a blank VI, put the I/O assistant alone, repeated the
procedure
3. Opened the template "Read and acquire", repeated the procedure

In all cases, no terminals appeared.

I definitely agree with you. My LabVIEW is failing in the final
code generation stage. I cannot understand whether there is a problem in
the way I prepare the procedure (but I don't think so, since everything
works fine, including when I parse the data) or a problem in the
LabVIEW installation (maybe I unintentionally excluded some essential
piece of the program?).

If this can help, I noticed that when I select the I/O assistant and
set the "Select Instrument" port and timeout, the lateral option "code
generation type" is not active, while in the Flash tutorial on the NI
site I saw that it should be active.

Thank you for your help though



waveform graph property nodes

2004-02-14 Thread Doug1974
I have written an application which continuously acquires data from my
daq card (2000 samples @ 2000sam/sec).  I am appending each iteration
and displaying the entire waveform on one waveform graph.  I have a
second waveform graph that I'd like to use to dynamically select a
particular range (so I can zoom in to a small portion of the larger
waveform).  To do this I require the ability to modify the XScale at
runtime.  I have tried using the Waveform Graph Property Node
(XScale.Min and Max) and have noticed that these cannot be manipulated
at runtime.  I browsed some example code and found one example that
uses the property "XScale.OldRange", yet I cannot seem to find this
property in the list available when you right-click on the node.
Where is this property located?  I have copied the
property node from the example, pasted it into my own application,
and linked it to the graph I'm trying to manipulate, and it does work;
however, I'd like to be able to select the property myself.

The example to which I'm referring is "Two Channel Oscilloscope.vi"
found in the "Find Examples" menu in Labview.  ***I'm currently
working with LV6.1.



Re: Can we acquire the Analog input and Digital Input at

2004-02-14 Thread Ben
Hi Chenchen,

It usually simplifies things greatly if you use analog inputs to
monitor both the analog signals AND the digital signals.

If you absolutely HAVE to use digital lines, then the answer is: it
DEPENDS.

With the correct hardware, and depending on how fast you want to
collect data and how closely the measurements have to be synchronized
in time, it can be done.

But first, consider using analog inputs to monitor all signals.

If you need to move forward with both analog and digital acquisitions,
then post a new question and give us details on the hardware you are
using and the other details we will need to advise you.

Trying to help,

Ben


Ben Rayner
Certified LabVIEW Developer
www.DSAutomation.com



Re: read strings INTO enum?

2004-02-14 Thread William Gilbert
Belated thanks Bruce! You saved me a TON of work already. I've been 
editing the state list (too many times) in Excel, pasting it into a 
string control and looping it into a text ring, which then gets pasted 
into a strict typedef and converted to an enum.

I did a project of similar complexity back in '97, either before I knew 
about typedefs or maybe before they were available. THAT got a bit 
labor intensive.

-BG

Bruce Ammons wrote:
Bill,

You can't do it with an enum, but you can do it with a text ring.
Create a small vi that just sets the strings[] list for a blank text
ring.  Copy the text ring into your program and convert it to an enum.
Ta-daa!
Bruce

--
Bruce Ammons
Ammons Engineering
www.ammonsengineering.com
(810) 687-4288 Phone
(810) 687-6202 Fax


-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Bill Gilbert
Sent: Wednesday, January 28, 2004 12:27 PM
To: Labview Listserv posting
Subject: read strings INTO enum?
In the past, when using an enum for a state variable, I've read the 
strings out as an array from a property node. Works fine.

I got to thinking that if I could reverse the process and paste a list 
of strings into a string control(from excel), generate a string array, 
and read them INTO a blank enum, I could save myself a bunch of typing. 
I can get the string array, no problem, but I can't get the enum to 
accept them. I get a broken Run arrow, with the message "Enum has 
duplicate values", usually two blank entries.

Am I just barking up the wrong tree?

--BG

Bill Gilbert, EM Tech
UMN School of Physics and Astronomy
Tel 612 624 4870
Fax 612 624 4578





--

Bill Gilbert, EM Tech
UMN School of Physics and Astronomy
Tel   612 624 4870
Pager 612 622 0174
Fax   612 624 4578  
[EMAIL PROTECTED]



Re: Does LabVIEW support Shared Memory File created by VC++?

2004-02-14 Thread Martin Riddle
No, not directly, unless you are using VXI and sharing memory (educated
guess, I have never used VXI). But you can call a DLL and retrieve the
data from a call to the file mapping object in the DLL.
So, yes, you can have the data cross processes.
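
If you end up writing that wrapper DLL yourself, the reader side might
look roughly like the sketch below. The mapping name and size are
assumptions and must match whatever the VC++ writer passed to
CreateFileMapping(); LabVIEW would then call such a function through the
Call Library Function Node.

#include <windows.h>
#include <stdio.h>

/* Hypothetical name/size: must match what the VC++ writer used in CreateFileMapping(). */
#define MAP_NAME  "MySharedData"
#define MAP_SIZE  4096

int main(void)
{
    HANDLE hMap = OpenFileMappingA(FILE_MAP_READ, FALSE, MAP_NAME);
    if (hMap == NULL) {
        printf("OpenFileMapping failed: %lu\n", GetLastError());
        return 1;
    }

    const char *view = (const char *)MapViewOfFile(hMap, FILE_MAP_READ, 0, 0, MAP_SIZE);
    if (view == NULL) {
        printf("MapViewOfFile failed: %lu\n", GetLastError());
        CloseHandle(hMap);
        return 1;
    }

    /* Interpret the bytes however the writer laid them out; here we just
       assume the first bytes are a NUL-terminated string. */
    printf("shared data: %.64s\n", view);

    UnmapViewOfFile(view);
    CloseHandle(hMap);
    return 0;
}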

Have fun

"Stars" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]
> Can LabVIEW program read and write online data stored in a Shared
> Memory File created by a VC++ application using Windows SDK Functions:
> CreateFileMapping, OpenFileMapping, MapViewOfFile ... when both
> programs are running?  If yes, how? What vis? Any examples?
>
> Thank you very much.
> Stars





Re: Filtering waveform?

2004-02-14 Thread LocalDSP
A correction to my previous comment. Linear or Spline interpolation is
not the way to go if you wish to decimate your data. The correct
technique is, as you mention, to filter your signal before data
decimation (dropping samples).

You are considering using a Bessel filter. I assume this is because
you want to minimize the time domain distortion of your signal. This
could work but Bessel filters are not very steep and it will be
difficult to keep a reasonable bandwidth and protect against aliasing
at the same time.

I would recommend using an FIR filter instead of the IIR Bessel
filter. The FIR filter is linear phase and well suited to data
decimation. You'll find the high-level VI in the Analyze>>Waveform
Conditioning palette.
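
If you want to see roughly what such a stage does, here is a bare-bones
C sketch of FIR decimation: a windowed-sinc low-pass followed by sample
dropping. The tap count, cutoff and decimation factor are arbitrary
choices for illustration; the LabVIEW waveform conditioning VI takes
care of all of this for you.

#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979323846

/* Windowed-sinc (Hamming) low-pass design; fc is the cutoff as a fraction of
   the sample rate (0 < fc < 0.5) and ntaps should be odd. */
static void design_lowpass(double *taps, int ntaps, double fc)
{
    int mid = (ntaps - 1) / 2;
    double sum = 0.0;
    for (int n = 0; n < ntaps; n++) {
        double x = n - mid;
        taps[n]  = (x == 0.0) ? 2.0 * fc : sin(2.0 * PI * fc * x) / (PI * x);
        taps[n] *= 0.54 - 0.46 * cos(2.0 * PI * n / (ntaps - 1));  /* Hamming window */
        sum += taps[n];
    }
    for (int n = 0; n < ntaps; n++)
        taps[n] /= sum;                      /* normalize for unity gain at DC */
}

/* Filter, then keep every 'factor'-th output sample. Returns the output count. */
static int fir_decimate(const double *in, int n_in, double *out,
                        const double *taps, int ntaps, int factor)
{
    int n_out = 0;
    for (int i = 0; i < n_in; i += factor) {
        double acc = 0.0;
        for (int k = 0; k < ntaps && k <= i; k++)
            acc += taps[k] * in[i - k];
        out[n_out++] = acc;
    }
    return n_out;
}

int main(void)
{
    enum { N = 1000, NTAPS = 51, FACTOR = 4 };
    double in[N], out[N / FACTOR + 1], taps[NTAPS];

    for (int i = 0; i < N; i++)              /* dummy input: a slow sine */
        in[i] = sin(2.0 * PI * 5.0 * i / N);

    /* Cutoff kept below 0.5/FACTOR of the original rate to avoid aliasing. */
    design_lowpass(taps, NTAPS, 0.1);
    int n_out = fir_decimate(in, N, out, taps, NTAPS, FACTOR);
    printf("kept %d of %d samples\n", n_out, N);
    return 0;
}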



Re: Instrument I/O assistant doesn't show terminals

2004-02-14 Thread TommyTestBoy
It sounds like the I/O Assistant is failing to generate code onto the
block diagram.  Try dropping a new I/O Assistant Express block on your
diagram (under the Instrument I/O palette), configuring it the same
way as the one in the example, and pressing OK.  Does the same thing
happen?



Re: How plotting on a digital waveform graph the bit transition in function of the time

2004-02-14 Thread St Augustine
My name is Vijay. For the scaling on the X axis, can you please tell me
what X0 and delta X denote? What scaling factor are they referring
to? Thanks in advance. Also, in order for you to see the binary
representation of your digital inputs, you can open up the "read from one
digital port" VI located under Data Acq >> Digital I/O >> Immediate
Digital I/O. Then you can wire a graph to the output from the pattern
node before the number is converted to boolean. Hope that helps,
Jonathan!



Re: How to change the properties of waveform graph from time domain

2004-02-14 Thread St Augustine
I have a similar problem. So what you are saying is that just
changing the scale to decimal after the FFT is taken means that the
unit is in frequency?



Re: XY graph y scale text color change in LV6.1

2004-02-14 Thread Mads
Create a property node linked to the graph (right-click on the graph
and select Create Property Node), expand it to show two properties, and
make the first property the Active Y Scale property; wire the index of
the right-hand Y scale to it (it could be 0 or 1 if you have two). Now,
the second property you need to set is the YScale.TextColor property,
which can be found as YScale >> Marker Colors >> Text Color in the
properties menu (it pops up if you click on the property node terminal).
Wire a color value to that and you're there.



maximum input frequency of the PCI-6014?

2004-02-14 Thread St Augustine
I am using the PCI-6014 configured for digital inputs. At low
frequencies my quantized waveform is very smooth; however, when I
increase my input frequency, the waveform becomes highly distorted. I'm
sure it's not my A/D converter, because I am sampling at more than eight
times the input frequency. Can you please help me!



Re: Serial Port

2004-02-14 Thread Rolf Kalbermatter
"Subramanian, Shankar, K" <[EMAIL PROTECTED]> wrote:

>I have done data acquisition using NI DAQ card for one of my projects. Right
>now, have a small project in which data needs to be acquired from a serial
>port. The system has a temperature controller unit which sends data to the
>serial port and needs to be collected using LabView. 
>
>How different is it to configure and acquire data using serial port rather
>than NIDAQ card. Could anyone help me out in this? Also, this is going to be
>done using LabView 7 - Evaluation Version. I guess there are no restrictions
>while we use evaluation version. I hope all the features of a full
>development system could be used. 

Hmm, how do you suppose National Instruments makes money to pay for the
development of LabVIEW if the Evaluation version could do the same things
as a normal development system?

As far as replacing NI-DAQ calls with calls to communicate with your serial
device goes, that is quite easy if you already have the VIs to talk to your
serial device. Otherwise it is a matter of writing a VI or two to communicate
over the serial port with whatever protocol your device has. This part can be
tricky because with serial ports not even the low-level communication
parameters are always the same. RS-232 stands for Recommended Standard and it
is nothing more than that. Each manufacturer has its own ideas about how to
interpret that standard, and the communication protocol itself is not
standardized in any way. You have almost as many protocols as there are
manufacturers out there.
Once you have that VI working, it is basically just a matter of replacing the
DAQ VI with your device communication VI.
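
Just to make the serial side concrete, the bare-bones equivalent in
plain Win32 C is: open the port, set the parameters your controller
expects, and read. The port name and the 9600-8-N-1 settings below are
pure assumptions; in LabVIEW you would do the same through the VISA or
serial compatibility VIs.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* "COM1" and 9600-8-N-1 are assumptions; use whatever your controller needs. */
    HANDLE h = CreateFileA("COM1", GENERIC_READ | GENERIC_WRITE, 0, NULL,
                           OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        printf("cannot open COM1: %lu\n", GetLastError());
        return 1;
    }

    DCB dcb = {0};
    dcb.DCBlength = sizeof dcb;
    GetCommState(h, &dcb);
    dcb.BaudRate = CBR_9600;
    dcb.ByteSize = 8;
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    SetCommState(h, &dcb);

    COMMTIMEOUTS to = {0};
    to.ReadIntervalTimeout     = 50;    /* ms between bytes before a read returns */
    to.ReadTotalTimeoutConstant = 1000; /* overall read timeout in ms             */
    SetCommTimeouts(h, &to);

    char buf[64];
    DWORD nread = 0;
    if (ReadFile(h, buf, sizeof(buf) - 1, &nread, NULL) && nread > 0) {
        buf[nread] = '\0';
        printf("controller says: %s\n", buf);
    }

    CloseHandle(h);
    return 0;
}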

Rolf Kalbermatter
CIT Engineering Nederland BV    tel: +31 (070) 415 9190
Treubstraat 7H                  fax: +31 (070) 415 9191
2288 EG Rijswijk                http://www.citengineering.com
Netherlands                     mailto:[EMAIL PROTECTED]
 




Switch Screen saver OFF

2004-02-14 Thread Sergey Yakovlev
Could someone suggest how I can programmatically suppress the Windows
screen saver while my VI is running?





Does LabVIEW support Shared Memory File created by VC++?

2004-02-14 Thread Stars
Can LabVIEW program read and write online data stored in a Shared
Memory File created by a VC++ application using Windows SDK Functions:
CreateFileMapping, OpenFileMapping, MapViewOfFile ... when both
programs are running?  If yes, how? What vis? Any examples?

Thank you very much.
Stars



RE: low level device drivers

2004-02-14 Thread Rolf Kalbermatter
"Howard, John" <[EMAIL PROTECTED]> wrote:

>I am considering using a PCI card (Sabtech NTDS interface card for those
>interested), and it doesn't seem to have a nice simple DLL like I am use to. 
>Instead it has a kernel mode Windows 2000 device driver, and I am expected to
>use "...normal Windows I/O system calls..." to talk to it.  The driver is
>written to be part of the Serial Port driver class, BUT it can not be treated
>simply as a serial port since "...there are several restrictions on the I/O
>calls...", and "...it has its own set of DeviceIoControl functions."
>
>I know so little about Windows device drivers that I don't really even
>understand the paragraph I just wrote.  Does this mean I can't talk to the
>card directly from LabVIEW?  If I can, is it difficult for someone who uses
>DLL's fairly often?  Can I use the Win32 API to talk to it?  Does anyone have
>any suggestions on how to approach this?

Well, you load a device driver by calling CreateFile() with the device driver
name (usually something like \\.\MyDevice). Newer device drivers do not use
a hardcoded device name but a class GUID, which you have to translate into the
appropriate device name through the setupapi functions before loading the driver.

After that you basically call DeviceIoControl() and possibly ReadFile() and 
WriteFile() on the handle returned from CreateFile(). At the end you close the
handle with the CloseHandle() function. It is not impossible to call all these 
functions directly from LabVIEW with the Call Library Node, but especially
DeviceIoControl() gets nasty very fast.

So it is possible, but unless you only have to implement one or two simple
DeviceIoControl() calls it would probably be better to implement the interface
in a DLL with a more LabVIEW-friendly call interface.
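
To make that call sequence concrete, here is a bare C sketch; both the
device name and the IOCTL code are placeholders I made up, and the card's
driver documentation defines the real ones.

#include <windows.h>
#include <winioctl.h>
#include <stdio.h>

/* Placeholders only; take the real values from the card's documentation. */
#define DEVICE_NAME      "\\\\.\\MyNtdsDevice"
#define IOCTL_MY_STATUS  CTL_CODE(FILE_DEVICE_UNKNOWN, 0x800, METHOD_BUFFERED, FILE_ANY_ACCESS)

int main(void)
{
    HANDLE h = CreateFileA(DEVICE_NAME, GENERIC_READ | GENERIC_WRITE, 0, NULL,
                           OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        printf("CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    DWORD status = 0, returned = 0;
    if (DeviceIoControl(h, IOCTL_MY_STATUS, NULL, 0,
                        &status, sizeof(status), &returned, NULL))
        printf("status word: 0x%08lX\n", status);
    else
        printf("DeviceIoControl failed: %lu\n", GetLastError());

    CloseHandle(h);
    return 0;
}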

Rolf Kalbermatter
CIT Engineering Nederland BV    tel: +31 (070) 415 9190
Treubstraat 7H                  fax: +31 (070) 415 9191
2288 EG Rijswijk                http://www.citengineering.com
Netherlands                     mailto:[EMAIL PROTECTED]
 




Re: Serial Compatibility VI's not working in Labview 7.0

2004-02-14 Thread fahlers
You have to put serpdrv into the folder where lvrt.dll (the LabVIEW
runtime engine) is located; that worked for me.



RE: other persons software (was: why I hate express vis.....)

2004-02-14 Thread Rolf Kalbermatter
"Scott Hannahs" <[EMAIL PROTECTED]> wrote:

>Opening diagram extends over about 7 screens horizontally and 6 screens vertically
>(1600X1200 pixel screens).  Ok, bad enough, now open the hierarchy window and I
>see a main VI and about 200 express nodes, that I can't open without converting
>to sub-VIs.  There were 4 actual real non-system subVIs in the whole thing with
>main VI using about 12 Mb of disk space.
>
>Unfortunately the programmer had hardwired the path separation character so it
>failed on all platforms except the development one.  The character was in 26
>different places and each place was wired into somewhere between 1 and a dozen
>string building functions.

Sounds almost like another system I had the "joy" to upgrade to the newest
LabVIEW version lately. The original programmer was a seasoned C programmer
with little to no knowledge of LabVIEW. Lots of globals and locals all
around, and four cascaded state machines, all in the main VI, each implemented
as a case structure with around 100 states, serialized in a sequence
structure even though only one state machine could be active at a
time since they all depended on the same state variable, which of course
was a local too, as shift registers are just too darn sophisticated.
The main VI was around 10 MB and all the subVIs (maybe around 100) together
were less than that. Opening the diagram on the original computer, an
industrial computer from about two years ago, took 20 or more seconds before
it would show up! I was tempted to throw it all away, but as the state
machines themselves represented the entire process intelligence of the
system, with little or no documentation besides the LabVIEW diagram,
I reconstructed the system from there, putting each sub state machine in
its own subVI and removing as many globals and locals as possible.

After some four weeks of programming it is now in the final testing phase
and seems to work as originally intended, with a few quirks removed and
optimizations in place.

Rolf Kalbermatter
CIT Engineering Nederland BV    tel: +31 (070) 415 9190
Treubstraat 7H                  fax: +31 (070) 415 9191
2288 EG Rijswijk                http://www.citengineering.com
Netherlands                     mailto:[EMAIL PROTECTED]
 




Dual Monitors

2004-02-14 Thread Jack Hamilton
Matrox G550 card issues:

I will add that there are some dual monitor quirks outside of LabVIEW. In
Windows, Microsoft Photo Editor will occasionally ‘break’ and either be full
screen or minimized and do nothing in between. I’ve had to uninstall and
reinstall the program to fix it.

Some 3D rendering applications like Maya 4D and Bryce 3D want to own the
entire screen; they don’t support non-full-screen mode, so you can’t resize
the program window and pull it to one screen. It’s quite irritating to have
a 3” gap of monitor bezels going right through the middle of the
application. Many programs that want to “auto center”, including LabVIEW VIs
so configured, will be spread across both monitors.

I’ve run into most problems when I mess with the ‘smart’ features of the
card, like having the card try to fool an application into thinking one screen
is dominant and maximizing to only that monitor. This is how MS Photo Draw broke.

It’s quite easy with the Matrox card to turn off the second monitor without
a reboot to accommodate those single screen applications.

When you install the Matrox G550 card, be sure to do it yourself! A
client had an IS guy build a couple of machines with G550s; during the
driver install it asks if you want “Dual monitor support”. If you say NO,
which he did, you can’t enable it after the fact (or I could not find a
way). I spent hours messing around, then tried an uninstall and reinstall and
discovered this prompt.

One very neat feature, and I don’t know if it’s intentional, is that if you
open a VI on a single-monitor computer that was written and saved on the right
monitor of a dual monitor system, it will show up at the far right, but is
still draggable onto the single screen. Otherwise it would be quite a nuisance
to not be able to get to a front panel or diagram because you forgot to drag
it over to the left screen before saving the VI.

I love my G550, and I have looked at the APEX 4-monitor card, but I don’t
know much about APEX; if Matrox came out with one I’d buy it.

As far as monitor splitters go, you really need the thick shielded and
insulated video cable ($$$); you’ll get a fuzzy screen and a headache if you
use a cheap video extension or switchbox cable.

Jack Hamilton
Hamilton Design
[EMAIL PROTECTED]
www.Labuseful.com
714-839-6375 Office