Re: [U2] Large DICT affecting I/O

2013-08-08 Thread Ross Ferris
So, how many reports or queries do you have that reference all 2000 dict items 
... really?

When you are working with this 2,000 field item, do you work with the item as a 
dimensioned array, or a dynamic array?

How many of these 2,000 data elements are multi-valued (or sub-multi-valued, 
or worse?), and how many of them are associated multi-values?

Performance issues are NOT related to the dictionary size (though dictionaries 
may accumulate debris over time), as the dictionary can be readily resized to 
be the right size -- I would hypothesise that if you had the tools to 
perform a field-level data analysis of USAGE (how many fields appear grouped 
on screens & reports together, and how often they are used) you would find that 
there are some hot fields that are heavily used every day, whilst others are 
used sporadically every week/month, which could be used to guide any 
re-engineering.
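A rough version of the usage analysis described above could be sketched as below. This is a hypothetical illustration, not anything from the thread: the report directory, the ".txt" extension, and the field names are all assumptions, and real report definitions would need their own parser.

```python
# Hypothetical sketch of a field-level USAGE analysis: count how often
# each dictionary field name appears across exported report/query
# definitions, so "hot" fields sort to the top.
import re
from collections import Counter
from pathlib import Path

def field_usage(report_dir, field_names):
    """Count references to each dictionary field across report text files."""
    counts = Counter()
    token = re.compile(r"[A-Z][A-Z0-9._]+")  # crude dictionary-name token
    fields = set(field_names)
    for path in Path(report_dir).glob("*.txt"):
        for name in token.findall(path.read_text()):
            if name in fields:
                counts[name] += 1
    return counts

# Hot fields first (hypothetical path and field list):
# for name, n in field_usage("reports/", dict_fields).most_common(20):
#     print(name, n)
```

Sorting the counts would then give exactly the hot-versus-sporadic split hypothesised above, as input to any re-engineering.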

However, this is probably all academic, because in reality you are probably not 
in a position to re-structure the database anyway :-(

Ross Ferris
Stamina Software
Visage - Better by Design!

-Original Message-
From: u2-users-boun...@listserver.u2ug.org 
[mailto:u2-users-boun...@listserver.u2ug.org] On Behalf Of Jeffrey Butera
Sent: Wednesday, August 7, 2013 8:07 PM
To: U2 Users List
Cc: U2 Users List
Subject: Re: [U2] Large DICT affecting I/O

Thanks to those who replied.

1) we have tools to easily edit dictionaries of any size (one of the few 
benefits of Datatel/ Ellucian)

2) we have tools to monitor and resize dictionaries just like any other file ( 
also a Datatel benefit)

It's more of a general question about performance. We import data from the 
Common Application which has almost 2000 data attributes per person. Thus 
having a large DICT isn't sloppy or lazy work on our end, it's a necessity of 
the data.

Thus the question is better stated as:

Is unidata performance better if we stuff all 2000 elements in a single DICT or 
break the data into multiple (eg: 4) files of 500 elements each?  

When we work with this data we need all 2000 elements so is reading 4 or 5 
separate tables any more efficient than reading a single large table of 2000 
elements? 

Jeffrey Butera, PhD
Associate Director for Application and Web Services Information Technology 
Hampshire College
413-559-5556

___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] Large DICT affecting I/O

2013-08-07 Thread Brian Leach
Jeff

By 'large dictionary' do you mean that the dictionary itself is too large -
has lots of synonyms - or that the data it is describing has that number of
fields and so has become too large for efficient storage?

If the former, I've found people often forget to resize their dictionaries
and the VOC file alongside the data. VOC is particularly vulnerable as
everything goes through it. I've seen a VOC file with half a million entries
in it on one site. Dictionary and VOC are no different storage-wise from other
files; they need to be cared for :)

In terms of the records being described, however, that's more of an issue.
Are you getting efficient storage? If data is being prematurely pushed into
overflow - even level 1 - that's bound to cause performance issues. And
UniData doesn't have the hint mechanisms of UniVerse, so I'd suspect that
accessing higher-order fields would be slow, though I've not benchmarked
that.

Brian


-Original Message-
From: u2-users-boun...@listserver.u2ug.org
[mailto:u2-users-boun...@listserver.u2ug.org] On Behalf Of jeffrey Butera
Sent: 07 August 2013 04:09
To: U2 Users List
Subject: [U2] Large DICT affecting I/O

I'm curious how large of a DICTionary some of you have worked with and, in
particular, how very large DICTs can adversely affect applications.

We have a DICT approaching 1500 data elements (no idescs), which is quite
large for us. But I'm curious if others have DICTs this large or larger and
see no adverse effect on their application performance.

This is Unidata 7.3.4 if it matters.

--
Jeffrey Butera, PhD
Associate Director for Application and Web Services Information Technology
Hampshire College
413-559-5556



___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] Large DICT affecting I/O

2013-08-07 Thread Jeffrey Butera
Thanks to those who replied.

1) we have tools to easily edit dictionaries of any size (one of the few 
benefits of Datatel/ Ellucian)

2) we have tools to monitor and resize dictionaries just like any other file 
(also a Datatel benefit)

It's more of a general question about performance. We import data from the 
Common Application, which has almost 2000 data attributes per person. Thus 
having a large DICT isn't sloppy or lazy work on our end; it's a necessity of 
the data.

Thus the question is better stated as:

Is UniData performance better if we stuff all 2000 elements in a single DICT or 
break the data into multiple (e.g. 4) files of 500 elements each?

When we work with this data we need all 2000 elements, so is reading 4 or 5 
separate tables any more efficient than reading a single large table of 2000 
elements?

Jeffrey Butera, PhD
Associate Director for Application and Web Services 
Information Technology 
Hampshire College
413-559-5556

___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] Large DICT affecting I/O

2013-08-07 Thread Rutherford, Marc
Jeffrey,

As previous posts have pointed out, a dictionary is no different from any other 
file. It needs to be properly sized for the data it holds. When properly 
sized, it will have maximum performance by definition. No need to 'split' 
files.

At TCL: 'file.stat DICT filename'

Example:  'file.stat DICT PARTS'

If file.stat shows a new file size recommendation on its last line, then run 
(after hours, with no users or running processes that may use the dictionary):

At TCL: '!memresize DICT filename new.size' 

Example:   '!memresize DICT PARTS 1009'

Marc Rutherford
Principal Programmer Analyst
Advanced Bionics LLC
(661) 362-1754

-Original Message-
From: u2-users-boun...@listserver.u2ug.org 
[mailto:u2-users-boun...@listserver.u2ug.org] On Behalf Of Jeffrey Butera
Sent: Wednesday, August 07, 2013 3:07 AM
To: U2 Users List
Cc: U2 Users List
Subject: Re: [U2] Large DICT affecting I/O

Thanks to those who replied.

1) we have tools to easily edit dictionaries of any size (one of the few 
benefits of Datatel/ Ellucian)

2) we have tools to monitor and resize dictionaries just like any other file ( 
also a Datatel benefit)

It's more of a general question about performance. We import data from the 
Common Application which has almost 2000 data attributes per person. Thus 
having a large DICT isn't sloppy or lazy work on our end, it's a necessity of 
the data.

Thus the question is better stated as:

Is unidata performance better if we stuff all 2000 elements in a single DICT or 
break the data into multiple (eg: 4) files of 500 elements each?  

When we work with this data we need all 2000 elements so is reading 4 or 5 
separate tables any more efficient than reading a single large table of 2000 
elements? 

Jeffrey Butera, PhD
Associate Director for Application and Web Services Information Technology 
Hampshire College
413-559-5556

___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] Large DICT affecting I/O

2013-08-07 Thread Wjhonson
Provided your dictionaries, in the one-file scenario or the five-file scenario, 
are each and all properly sized, THEN
you are more efficient reading all dict entries from a single dict file, 
BECAUSE
your app is only handling a single file pointer, AND
your internal refs are only viewing groups from a single file, and 
registers are only pointing at groups in a single file.

So your internal usage pool is much smaller, which also means less internal 
clean-up work when you are done.
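The argument above can be reduced to a toy cost model. Everything here is an assumption for illustration (unit costs, one read per logical record per physical file), not a UniData measurement:

```python
def split_read_cost(n_records, n_files, open_cost=1, read_cost=1):
    """Toy cost model: open each physical file once, then perform one
    hashed-file read per logical record per physical file."""
    return n_files * open_cost + n_records * n_files * read_cost

# Splitting 2000 fields across 4 files quadruples per-record reads:
# split_read_cost(10_000, 1) -> 10_001
# split_read_cost(10_000, 4) -> 40_004
```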


-Original Message-
From: Jeffrey Butera jbut...@hampshire.edu
To: U2 Users List u2-users@listserver.u2ug.org
Cc: U2 Users List u2-users@listserver.u2ug.org
Sent: Wed, Aug 7, 2013 3:07 am
Subject: Re: [U2] Large DICT affecting I/O


Thanks to those who replied.

1) we have tools to easily edit dictionaries of any size (one of the few 
benefits of Datatel/ Ellucian)

2) we have tools to monitor and resize dictionaries just like any other file ( 
also a Datatel benefit)

It's more of a general question about performance. We import data from the 
Common Application which has almost 2000 data attributes per person. Thus 
having 
a large DICT isn't sloppy or lazy work on our end, it's a necessity of the data.

Thus the question is better stated as:

Is unidata performance better if we stuff all 2000 elements in a single DICT or 
break the data into multiple (eg: 4) files of 500 elements each?  

When we work with this data we need all 2000 elements so is reading 4 or 5 
separate tables any more efficient than reading a single large table of 2000 
elements? 

Jeffrey Butera, PhD
Associate Director for Application and Web Services 
Information Technology 
Hampshire College
413-559-5556

___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


[U2] Large DICT affecting I/O

2013-08-06 Thread jeffrey Butera
I'm curious how large of a DICTionary some of you have worked with and, 
in particular, how very large DICTs can adversely affect applications.


We have a DICT approaching 1500 data elements (no idescs), which is 
quite large for us. But I'm curious if others have DICTs this large or 
larger and see no adverse effect on their application performance.


This is Unidata 7.3.4 if it matters.

--
Jeffrey Butera, PhD
Associate Director for Application and Web Services
Information Technology
Hampshire College
413-559-5556

___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users


Re: [U2] Large DICT affecting I/O

2013-08-06 Thread Doug Averch
Hi Jeffrey:

We have a client with 6,000 dictionary items and they have no performance
problems. If the dictionary is sized correctly, there generally is no
performance hit. However, editing it with some tools is a pain because it
takes quite a long time to read them.

Regards,
Doug
www.u2logic.com
XLr8Dictionary Editor for large dictionary editing

On Tue, Aug 6, 2013 at 9:08 PM, jeffrey Butera jbut...@hampshire.eduwrote:

 I'm curious how large of a DICTionary some of you have worked with and, in
 particular, how very large DICTs can adversely affect applications.

 We have a DICT approaching 1500 data elements (no idescs), which is
 quite large for us. But I'm curious if others have DICTs this large or
 larger and see no adverse effect on their application performance.

 This is Unidata 7.3.4 if it matters.

 --
 Jeffrey Butera, PhD
 Associate Director for Application and Web Services
 Information Technology
 Hampshire College
 413-559-5556


___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users



Re: [U2] Large DICT affecting I/O

2013-08-06 Thread Ross Ferris
I would suggest that rather than the DICTIONARY being a performance problem, 
which can be easily overcome with a suitably sized file, the REAL impact may be 
in day-to-day processing, where this file might be getting referenced to display 
a dozen or so fields while each I/O is actually pulling in much, Much, MUCH 
more: two orders of magnitude more fields than it needs.

Sure, UD/UV/MV ALLOWS you to DO this sort of thing, but I would respectfully 
suggest that you shouldn't necessarily do so. Bad habits like this tend to keep 
evolving, and I suspect that if performance is currently OK (thanks to ever 
faster hardware), there is probably no desire (or budget :-) to go through the 
pain associated with fixing this.
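The "two orders of magnitude" figure above is easy to reproduce with a toy ratio, assuming (purely for illustration) uniformly sized fields:

```python
def io_amplification(fields_needed, total_fields, avg_field_bytes=20):
    """Ratio of bytes fetched (whole record) to bytes actually displayed,
    under the simplifying assumption of uniform field sizes."""
    return (total_fields * avg_field_bytes) / (fields_needed * avg_field_bytes)

# A dozen fields shown from a 2000-field record:
# io_amplification(12, 2000) -> ~167x more data moved than used
```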

-Original Message-
From: u2-users-boun...@listserver.u2ug.org 
[mailto:u2-users-boun...@listserver.u2ug.org] On Behalf Of jeffrey Butera
Sent: Wednesday, August 7, 2013 1:09 PM
To: U2 Users List
Subject: [U2] Large DICT affecting I/O

I'm curious how large of a DICTionary some of you have worked with and, in 
particular, how very large DICTs can adversely affect applications.

We have a DICT approaching 1500 data elements (no idescs), which is quite 
large for us. But I'm curious if others have DICTs this large or larger and 
see no adverse effect on their application performance.

This is Unidata 7.3.4 if it matters.

--
Jeffrey Butera, PhD
Associate Director for Application and Web Services Information Technology 
Hampshire College
413-559-5556

___
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users