RE: File server structure and perms

2010-08-09 Thread Ken Schaefer
Hmm - I don't think this is hard to do (based on your requirements below). A 
few groups and some NTFS permissions accomplish this.

You could have some folders set up as a template, and an admin would copy those 
folders for each new client and just run a script that re-ACLs the folders...

(I am assuming that the same Testers have access to all Test Results folders, 
rather than there being a group for each new client)
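
For example, a rough sketch of such a script (all names are placeholders - 
substitute your own groups and paths, and test the icacls switches on your 
build before trusting it):

rem usage: newclient.cmd AcmeCorp
rem copy the empty template tree, then re-ACL each subfolder
robocopy "D:\Templates\ClientMaster" "D:\Clients\%1" /E /XF *
icacls "D:\Clients\%1\Test Results" /inheritance:r /grant "SYSTEM":(OI)(CI)F /grant "DOMAIN\Domain Admins":(OI)(CI)F /grant "DOMAIN\Testers":(OI)(CI)M
icacls "D:\Clients\%1\Estimates" /inheritance:r /grant "SYSTEM":(OI)(CI)F /grant "DOMAIN\Domain Admins":(OI)(CI)F /grant "DOMAIN\Estimators":(OI)(CI)M

(one icacls line per subfolder that needs its own permissions; /inheritance:r 
strips the inherited ACEs so each subfolder carries only what you grant)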

Cheers
Ken

-Original Message-
From: Charlie Kaiser [mailto:charl...@golden-eagle.org] 
Sent: Tuesday, 10 August 2010 12:47 PM
To: NT System Admin Issues
Subject: File server structure and perms

I've been tasked with setting up a file server structure for a client. SBS 
2008. We normally set up Home, Shared, and Public. Client wants a completely 
different paradigm. They want a master folder for each of their clients, with 
subfolders below that which have varying permissions. So for example:

Client master folder
->test results
->notes
->estimates
->contracts

Each of the subfolders would have different perms; techs writing data to test 
results would not have access to estimates, for example.

They also wish to have a template setup so that each time they add a client, 
they can put this structure in place and have the appropriate permissions in 
effect.

I don't see a simple way to do this. It looks to be highly IT-intensive, which 
is something neither we nor the client want.

It almost sounds more like a SharePoint thing, although I have little 
first-hand knowledge of SharePoint deployments.

Any suggestions?

Thanks!




File server structure and perms

2010-08-09 Thread Charlie Kaiser
I've been tasked with setting up a file server structure for a client. SBS
2008. We normally set up Home, Shared, and Public. Client wants a completely
different paradigm. They want a master folder for each of their clients,
with subfolders below that which have varying permissions. So for example:

Client master folder
->test results
->notes
->estimates
->contracts

Each of the subfolders would have different perms; techs writing data to
test results would not have access to estimates, for example.

They also wish to have a template setup so that each time they add a client,
they can put this structure in place and have the appropriate permissions in
effect.

I don't see a simple way to do this. It looks to be highly IT-intensive,
which is something neither we nor the client want.

It almost sounds more like a SharePoint thing, although I have little
first-hand knowledge of SharePoint deployments.

Any suggestions?

Thanks!

***
Charlie Kaiser
charl...@golden-eagle.org
Kingman, AZ
***  






Re: ssh publishing on ISA

2010-08-09 Thread S Powell
ROCK ON!

FYI Y'all

the correct (for us, YMMV) answer was:

ISA 2006
Publish Non-Web Server Protocol
==> to internal server IP address
selected protocol ==> (user-defined "inbound SSH", port 22 TCP inbound)
Listen on ==> External

badda bing

thank you John and Devin.


Google.com  Learn it. Live it. Love it.



On Mon, Aug 9, 2010 at 14:47, Devin Meade  wrote:
> ISA 2004 - firewall policy - use the "New server publishing wizard":
> Enter the internal server IP address.
> Make a custom protocol with TCP / outbound / port 22.
> Select "External"
>
> I don't think you want the "Web server publishing wizard" as it requires a
> "listener".   Same goes for the other "new rule" types.
>
> After the wizard is done, you should get a policy like this:
> Name: Whatever you want
> Action: Allow
> Protocols: whatever you named it
> From / Listener: External
> To: Internal IP address
>
> You can add a schedule if you want.  IIRC the wizard got it 90% right, I
> always had to go change one of the parameters to make it work, go figure!  I
> did this quite often with Famatech RAdmin, but we don't use this anymore
>
> Hope this helps, Devin
>
>
> On Mon, Aug 9, 2010 at 3:57 PM, S Powell  wrote:
>>
>> yes it is the first rule.
>>
>>
>> Google.com  Learn it. Live it. Love it.
>>
>>
>>
>> On Mon, Aug 9, 2010 at 12:47, John Cook  wrote:
>> > Did you move that rule to the top?
>> > John W. Cook
>> > Systems Administrator
>> > Partnership for Strong Families
>> >
>> > - Original Message -
>> > From: S Powell 
>> > To: NT System Admin Issues 
>> > Sent: Mon Aug 09 15:39:55 2010
>> > Subject: ssh publishing on ISA
>> >
>> > Hello World!
>> >
>> > I'd be grateful to anyone out there who could give me a hand with this,
>> >
>> > I've got SSH running on a Mac (Xserve) and I cannot quite figure out
>> > how to publish it via our ISA.
>> >
>> > I've tried a non-web server rule allowing port 22 in and out, and yet
>> > this doesn't seem to work.
>> >
>> > Traffic seems to drop and is blocked by the default (enterprise deny
>> > all traffic) rule.
>> >
>> > TIA
>> >
>> >
>> > Google.com  Learn it. Live it. Love it.



Re: ssh publishing on ISA

2010-08-09 Thread Devin Meade
ISA 2004 - firewall policy - use the "New server publishing wizard":
Enter the internal server IP address.
Make a custom protocol with TCP / outbound / port 22.
Select "External"

I don't think you want the "Web server publishing wizard" as it requires a
"listener". Same goes for the other "new rule" types.

After the wizard is done, you should get a policy like this:
Name: Whatever you want
Action: Allow
Protocols: whatever you named it
From / Listener: External
To: Internal IP address

You can add a schedule if you want.  IIRC the wizard got it 90% right, I
always had to go change one of the parameters to make it work, go figure!  I
did this quite often with Famatech RAdmin, but we don't use this anymore
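
A quick way to sanity-check the rule afterwards is a plain TCP probe of the
external address from an outside host (a sketch only - 203.0.113.10 stands in
for your real external IP):

telnet 203.0.113.10 22

If the publish is working you should see the SSH banner (something like
SSH-2.0-OpenSSH_5.x); a timeout means the traffic is still being dropped.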

Hope this helps, Devin


On Mon, Aug 9, 2010 at 3:57 PM, S Powell  wrote:

> yes it is the first rule.
>
>
> Google.com  Learn it. Live it. Love it.
>
>
>
> On Mon, Aug 9, 2010 at 12:47, John Cook  wrote:
> > Did you move that rule to the top?
> > John W. Cook
> > Systems Administrator
> > Partnership for Strong Families
> >
> > - Original Message -
> > From: S Powell 
> > To: NT System Admin Issues 
> > Sent: Mon Aug 09 15:39:55 2010
> > Subject: ssh publishing on ISA
> >
> > Hello World!
> >
> > I'd be grateful to anyone out there who could give me a hand with this,
> >
> > I've got SSH running on a Mac (Xserve) and I cannot quite figure out
> > how to publish it via our ISA.
> >
> > I've tried a non-web server rule allowing port 22 in and out, and yet
> > this doesn't seem to work.
> >
> > Traffic seems to drop and is blocked by the default (enterprise deny
> > all traffic) rule.
> >
> > TIA
> >
> >
> > Google.com  Learn it. Live it. Love it.
> >

Re: ssh publishing on ISA

2010-08-09 Thread S Powell
yes it is the first rule.


Google.com  Learn it. Live it. Love it.



On Mon, Aug 9, 2010 at 12:47, John Cook  wrote:
> Did you move that rule to the top?
> John W. Cook
> Systems Administrator
> Partnership for Strong Families
>
> - Original Message -
> From: S Powell 
> To: NT System Admin Issues 
> Sent: Mon Aug 09 15:39:55 2010
> Subject: ssh publishing on ISA
>
> Hello World!
>
> I'd be grateful to anyone out there who could give me a hand with this,
>
> I've got SSH running on a Mac (Xserve) and I cannot quite figure out
> how to publish it via our ISA.
>
> I've tried a non-web server rule allowing port 22 in and out, and yet
> this doesn't seem to work.
>
> Traffic seems to drop and is blocked by the default (enterprise deny
> all traffic) rule.
>
> TIA
>
>
> Google.com  Learn it. Live it. Love it.



Re: ssh publishing on ISA

2010-08-09 Thread John Cook
Did you move that rule to the top?
John W. Cook
Systems Administrator
Partnership for Strong Families

- Original Message -
From: S Powell 
To: NT System Admin Issues 
Sent: Mon Aug 09 15:39:55 2010
Subject: ssh publishing on ISA

Hello World!

I'd be grateful to anyone out there who could give me a hand with this,

I've got SSH running on a Mac (Xserve) and I cannot quite figure out
how to publish it via our ISA.

I've tried a non-web server rule allowing port 22 in and out, and yet
this doesn't seem to work.

Traffic seems to drop and is blocked by the default (enterprise deny
all traffic) rule.

TIA


Google.com  Learn it. Live it. Love it.




ssh publishing on ISA

2010-08-09 Thread S Powell
Hello World!

I'd be grateful to anyone out there who could give me a hand with this,

I've got SSH running on a Mac (Xserve) and I cannot quite figure out
how to publish it via our ISA.

I've tried a non-web server rule allowing port 22 in and out, and yet
this doesn't seem to work.

Traffic seems to drop and is blocked by the default (enterprise deny
all traffic) rule.

TIA


Google.com  Learn it. Live it. Love it.




RE: De-duping recommendation

2010-08-09 Thread greg.sweers
It's not available, to my knowledge, in any product other than Storage Server; 
it is in use with their deployment technologies, but that's it.

Greg

-Original Message-
From: Matthew W. Ross [mailto:mr...@ephrataschools.org] 
Sent: Monday, August 09, 2010 1:23 PM
To: NT System Admin Issues
Subject: Re: De-duping recommendation

Is there a way to do deduplication at the Windows Server level? I see Microsoft 
has "Single Instance Storage" for Windows Storage Server 2008, but it doesn't 
appear to be an available feature for any of their other Server products.

Any 3rd party DeDup on Windows support out there?


--Matt Ross
Ephrata School District


- Original Message -
From: Lists - Level5 [mailto:li...@levelfive.us]
To: NT System Admin Issues [mailto:ntsysad...@lyris.sunbelt-software.com]
Sent: Sun, 08 Aug 2010 12:54:26 -0700
Subject: De-duping recommendation


> We are being approached by Dell to get a few of their Data Domain san units.
> We currently have 4 6TB EQL's and are running a 9TB/wk backup rotation. The
> owners have requested to have all data backed up and readily available for 7
> years. Nothing on tape.
> 
>  
> 
> I heard that NetApp is one of the leaders in the de-dupe space but no idea
> for certain. We are using Symantec BEX 2010 Enterprise currently, and Veeam
> enterprise for our VM's (which are not part of the 9tb data backups
> currently).
> 
>  
> 
> It looks like we will basically be backing up to the unit until its full
> (Dell says 30:1 or so) and then purchase a new one and so on and so on until
> someone thinks they spent too much. 
> 
>  
> 
> Are there any other ideas out there?
> 
>  
> 
> Thanks



Re: Cannot delete a PTR record, AD integrated DNS

2010-08-09 Thread mb
That seems to have worked, Coleman, thank you.  Needed to have a '.' after 
'arpa' in the command, but the link made that clear.  For anyone's reference, 
the IP on this reverse record was 10.1.1.101, and the command I used to smoke 
it was:

dnscmd /RecordDelete 10.in-addr.arpa. 101.1.1 PTR
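
(If you need to double-check what's actually at that node first, dnscmd can
enumerate it, e.g.:

dnscmd /EnumRecords 10.in-addr.arpa. 101.1.1 /Type PTR

run against the DNS server in question.)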


Appreciate the assist.




From: Coleman, Hunter 
Sent: Thursday, August 05, 2010 11:17 PM
To: NT System Admin Issues 
Subject: RE: Cannot delete a PTR record, AD integrated DNS


The hotfix only prevents new PTR records from getting created with capital 
letters in the host name. Existing records with that affliction can only be 
deleted with dnscmd.exe, IIRC.

 

From: Brian Desmond [mailto:br...@briandesmond.com] 
Sent: Thursday, August 05, 2010 5:17 PM
To: NT System Admin Issues
Subject: RE: Cannot delete a PTR record, AD integrated DNS

 

The KB he linked should be rolled into 2003 SP1 based on the date. 

 

Thanks,

Brian Desmond

br...@briandesmond.com

 

c   - 312.731.3132

 

From: Sean Martin [mailto:seanmarti...@gmail.com] 
Sent: Thursday, August 05, 2010 6:05 PM
To: NT System Admin Issues
Subject: Re: Cannot delete a PTR record, AD integrated DNS

 

I think Hunter is on the right track. I seem to recall having to run through a 
similar process for a similar issue.

 

- Sean

On Thu, Aug 5, 2010 at 1:53 PM, Brian Desmond  wrote:

So is the record on all your DCs or just one? Are you sure the reverse zone is 
replicating in the ForestDnsZones NDNC?

What I would suggest doing is turning on auditing for this subtree in AD and 
enabling DS Access auditing and then you can figure out what's causing it to 
get created.
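
(On 2008-era DCs, something along these lines turns that on - a sketch only;
on 2003 you'd enable "Audit directory service access" via Group Policy instead:

auditpol /set /subcategory:"Directory Service Changes" /success:enable

plus an auditing entry (SACL) on the zone container itself so the creates
actually land in the Security log.)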


Thanks,
Brian Desmond
br...@briandesmond.com

c   - 312.731.3132

-Original Message-
From: mb [mailto:midphan12...@gmail.com]

Sent: Thursday, August 05, 2010 4:25 PM
To: NT System Admin Issues

Subject: Re: Cannot delete a PTR record, AD integrated DNS

There is no corresponding A record for this PTR record.  There is however a 
different machine at that IP with A & PTR records, and this ghost PTR record is 
causing a little bit of grief to the folks that manage this other system.
The A record that originally existed for this ghost PTR record, that's been 
gone a couple of years at least.

Was looking for zone files just on a hunch.  I do understand that being AD 
integrated, this is stored in AD, but in my original note I mentioned that I 
used ADSIEdit to look in ForestDNSZones, and this ghost PTR record does not 
exist there.  So it's somehow local to any domain controller (because it 
reappears faster than it could be replicating back), and it's not where it 
should be within the AD database.

I'm missing something.


--
From: "Brian Desmond" 
Sent: Thursday, August 05, 2010 4:09 PM
To: "NT System Admin Issues" 
Subject: RE: Cannot delete a PTR record, AD integrated DNS

> There are no zone files there because your zones are stored in AD.
>
> What's the corresponding A record for this represent?
>
> Thanks,
> Brian Desmond
> br...@briandesmond.com
>
> c   - 312.731.3132
>
>

> -Original Message-
> From: mb [mailto:midphan12...@gmail.com]

> Sent: Thursday, August 05, 2010 4:07 PM
> To: NT System Admin Issues

> Subject: Re: Cannot delete a PTR record, AD integrated DNS
>
> This is interesting.
>
> Checked \system32\dns on a few of our domain controllers, I'm not
> finding any zone files with any data in them.  I haven't checked all
> the domain controllers.  One thing though - on any DC, if I delete
> this record and then immediately refresh the zone, that record is
> right there again, like it's coming from something local or I didn't
> actually delete the record (though I'm not seeing any kind of error dialogue).
>
> Checked properties on this record.  There's no timestamp, it's a
> static record.  I suppose that means it could never become stale -
> thought about trying the "Delete this record when it becomes stale"
> checkbox.  Just because I've tried everything I know that makes sense.
>
> I could interrupt DHCP if I do it late on a weekend night.  And it's
> worth a try.  But I just keep going back to the fact that this record
> reappears instantly, as fast as I can delete/refresh, that record is
> there, on any domain controller (all our DC's are running DNS).  So
> I'm thinking this isn't replicating from another DC or being
> dynamically created from a DHCP server.
>
>
> --
> From: "Ben Scott" 
> Sent: Thursday, August 05, 2010 2:00 PM
> To: "NT System Admin Issues" 
> Subject: Re: Cannot delete a PTR record, AD integrated DNS
>
>> On Thu, Aug 5, 2010 at 2:38 PM, mb  wrote:

>>> I've tried through ADSIEdit,
>>> and interestingly, this record does not exist there.  It does show
>>> up in the DNS console as a 'static' record, but I'm at a loss where
>>> it's coming from.
>>

>>  Check %SystemRoot%\system32\dns\ for any files which might contain
>> the offending record.

RE: OT: Vipre effectiveness & false positives

2010-08-09 Thread RichardMcClary
It appears to be at the top for "Proactive" (rather than simply 
"Reactive") - wow!

Alex Eckelberry wrote on 08/09/2010 11:53:21 AM:

> Fwiw, VIPRE just made #1 in proactive detection in the latest 
> VirusBulletin test:
> 
> http://www.virusbtn.com/vb100/rap-index.xml
> 
> 
> 
> 
> Alex
> 
> 
> 
> From: Ralph Smith [mailto:m...@gatewayindustries.org] 
> Sent: Thursday, August 05, 2010 2:57 PM
> To: NT System Admin Issues
> Subject: RE: OT: Vipre effectiveness & false positives
> 
> You're right.  The best approach is to make a decision based on what
> you want to be true, and then stick to it no matter what, 
> disregarding any information that might be troubling.
> Thanks for the enlightenment :-)
> 
> Actually, Alex sent me some data that shows detection results for 
> over 40 products including VIPRE, and while VIPRE wasn't always at 
> the top of the list for any given date or type of threat, it often 
> was and made a very good showing overall.
> Of course, anyone may interpret that data as they wish, but I was 
> satisfied by what it showed. 
> 
> Ralph
> 
> From: andy [mailto:afo...@psu.edu] 
> Sent: Thursday, August 05, 2010 11:59 AM
> To: NT System Admin Issues
> Subject: RE: OT: Vipre effectiveness & false positives
> 
> all data is used to indicate what you want it to show.
> 
> 
> At 09:52 PM 7/29/2010, Ralph Smith wrote:
> Willlburrr!...
> 
> From: Michael B. Smith [ mailto:mich...@smithcons.com] 
> Sent: Thursday, July 29, 2010 8:53 PM
> To: NT System Admin Issues
> Subject: RE: OT: Vipre effectiveness & false positives
> 
> Not if his name is Mr. Ed. :-)
> 
> Sent from my HTC Tilt 2, a Windows® phone from AT&T
> 
> From: Ralph Smith 
> Sent: Thursday, July 29, 2010 8:49 PM
> To: NT System Admin Issues 
> Subject: RE: Vipre effectiveness & false positives
> 
> I don't disagree, but when you are presented with information you 
> have to evaluate the validity of the data, and hopefully get 
> clarification from those involved when it implies that there may be 
> a problem.  Virus Bulletin actually warned in the explanation of the
> chart that it was just one result and that conclusions shouldn't be 
> jumped to until there was more data. 
> 
> And sometimes, a horse is just a horse, of course.
> 
> 
> 
> From: Kim Longenbaugh [ mailto:k...@colonialsavings.com] 
> Sent: Thursday, July 29, 2010 4:39 PM
> To: NT System Admin Issues
> Subject: RE: Vipre effectiveness & false positives
> 
> My point was really that all AV vendors have experienced FPs, not just
> VIPRE.
> 
> 
> 
> I agree that statistics can be a valuable tool, it's just that which
> ones you choose and how you present them can be misleading.  For 
> example, in a horse race between the US and Russia, the US horse 
> won.  In the American papers, it was reported that the US took 
> first place.  In the Russian papers, it was reported that the US was
> next to last and that Russia was second place.  The statistics 
> reported in both cases were true, but the picture they gave of the 
> race was very different.
> 
> 
> 
> From: Ralph Smith [ mailto:m...@gatewayindustries.org] 
> Sent: Thursday, July 29, 2010 3:08 PM
> To: NT System Admin Issues
> Subject: RE: Vipre effectiveness & false positives
> 
> 
> 
> True, but there were people on the VIPRE forum that were hit just as
> hard by a couple of the FPs that VIPRE had.  I'm not knocking VIPRE 
> at all - I like it a lot and would purchase it again with no hesitation.
> 
> 
> 
> However, when a well known organization like Virus Bulletin 
> publishes test results, it makes sense to look at the data and try 
> to understand what it means and how it may impact your organization.
> I personally feel confident with Sunbelt, but I would be interested 
> to understand how they interpret the chart and what they feel the 
> implications are for their product.
> 
> 
> 
> By the way, some lies may be statistics, but not all statistics are 
> lies.  Information, including statistical, is the basis for sound 
> decision making.
> 
> 
> 
> From: Kim Longenbaugh [ mailto:k...@colonialsavings.com] 
> Sent: Thursday, July 29, 2010 2:28 PM
> To: NT System Admin Issues
> Subject: RE: Vipre effectiveness & false positives
> 
> 
> 
> How about a little perspective on false positives?
> 
> 
> 
> http://news.cnet.com/8301-1009_3-20003074-83.html
> 
> 
> 
> and a reminder about statistics from Mark Twain:
> 
> "there's 3 kinds of lies: lies, damned lies, and statistics"
> 
> 
> 
> 
> 
> From: Ralph Smith [ mailto:m...@gatewayindustries.org] 
> Sent: Thursday, July 29, 2010 1:20 PM
> To: NT System Admin Issues
> Subject: RE: Vipre effectiveness & false positives
> 
> 
> 
> I've had VIPRE for a couple of years now, and was fortunately not 
> hit hard with the false positive problems others have had.  With 
> about 180 Win XP machines, I've had only a half dozen infections in 
> that time - all but one of the rogue AV kind, so I have been feeling
> pretty good.
> 
> 
> 
> However, the c

Re: De-duping recommendation

2010-08-09 Thread Matthew W. Ross
Is there a way to do deduplication at the Windows Server level? I see Microsoft 
has "Single Instance Storage" for Windows Storage Server 2008, but it doesn't 
appear to be an available feature for any of their other Server products.

Any 3rd party DeDup on Windows support out there?


--Matt Ross
Ephrata School District


- Original Message -
From: Lists - Level5 [mailto:li...@levelfive.us]
To: NT System Admin Issues [mailto:ntsysad...@lyris.sunbelt-software.com]
Sent: Sun, 08 Aug 2010 12:54:26 -0700
Subject: De-duping recommendation


> We are being approached by Dell to get a few of their Data Domain san units.
> We currently have 4 6TB EQL's and are running a 9TB/wk backup rotation. The
> owners have requested to have all data backed up and readily available for 7
> years. Nothing on tape.
> 
>  
> 
> I heard that NetApp is one of the leaders in the de-dupe space but no idea
> for certain. We are using Symantec BEX 2010 Enterprise currently, and Veeam
> enterprise for our VM's (which are not part of the 9tb data backups
> currently).
> 
>  
> 
> It looks like we will basically be backing up to the unit until its full
> (Dell says 30:1 or so) and then purchase a new one and so on and so on until
> someone thinks they spent too much. 
> 
>  
> 
> Are there any other ideas out there?
> 
>  
> 
> Thanks



Re: De-duping recommendation

2010-08-09 Thread Kevin Lundy
For those negotiating with EMC ... find the comparable Netapp solution.
Start mentioning that your requisition will be competitive.  EMC does not
like NetApp and will get pretty aggressive on pricing if they know they are
up against NetApp.




On Mon, Aug 9, 2010 at 11:04 AM, Rob Bonfiglio wrote:

> I should have been clearer.  It's not so much a problem with the Quantum,
> as it is that it wasn't spec'd out properly.  They knew it wasn't going to
> be able to meet future expansion needs as it was ordered, but they needed a
> solution in place quickly, and it was what they could afford at the time.
>
> I'm not the one who runs the backups here, but my understanding is that the
> backups are not able to finish before the process to dedupe the data
> begins.  So the data is deduped, and then has to be re-hydrated so it can be
> dumped onto tape.  This drastically increased the amount of time to complete
> a backup to tape.
>
> Supposedly the answer to this is to add another shelf of disks.  Adding
> another shelf would bump us up from 11TB to 22TB of dedupe space.  Our end
> goal is to have 50TB of storage, which is what we were using to compare with
> the Data Domain.  We were comparing a new 50TB implementation of Data Domain
> with the cost of upgrading the current Quantum implementation to 55TB.
>
> The other part of the comparison, which played into the pricing, was the
> replication of data from our remote locations to our new HQ building.  Data
> domain was willing to work with us through the use of an extended "trial"
> period, where Quantum was not.
>
> Hope that answers your questions, and clarifies my earlier statements.
> Like I said, most of the decision was based on pricing.
>
>
>
>
>  On Mon, Aug 9, 2010 at 7:56 AM, Eldridge, Dave wrote:
>
>>   Rob if you don’t mind can you expand on your issues with your Quantum
>> device? I have 2 7500dxi’s and they are functioning perfectly.
>>
>> thanks
>>
>>
>>
>> *From:* Rob Bonfiglio [mailto:robbonfig...@gmail.com]
>> *Sent:* Sunday, August 08, 2010 2:10 PM
>>
>> *To:* NT System Admin Issues
>> *Subject:* Re: De-duping recommendation
>>
>>
>>
>> We're purchasing Data Domain.  Looks like a great product.  We're
>> currently using Quantum's dedupe product, but it is not meeting our needs.
>> To upgrade it to the comparable Data Domain equivalent would be more
>> expensive than ripping out Quantum and going with Data Domain.  The cost for
>> us was a huge factor.  We are also going  through a consolidation.  We are
>> going to use Data Domain to dedupe and then replicate the data from our
>> remote sites  to our new HQ.  (Quantum can do this as well.)
>>
>>
>>
> Sorry I can't give you any current experience with it.  My understanding
>> though, is that Data Domain is the best in the business (and this actually
>> comes from the engineer of the company who is trying to sell us the Quantum
>> solution.)
>>
>> On Sun, Aug 8, 2010 at 3:54 PM, Lists - Level5 
>> wrote:
>>
>> We are being approached by Dell to get a few of their Data Domain san
>> units. We currently have 4 6TB EQL’s and are running a 9TB/wk backup
>> rotation. The owners have requested to have all data backed up and readily
>> available for 7 years. Nothing on tape.
>>
>>
>>
>> I heard that NetApp is one of the leaders in the de-dupe space but no idea
>> for certain. We are using Symantec BEX 2010 Enterprise currently, and Veeam
>> enterprise for our VM’s (which are not part of the 9tb data backups
>> currently).
>>
>>
>>
>> It looks like we will basically be backing up to the unit until its full
>> (Dell says 30:1 or so) and then purchase a new one and so on and so on until
>> someone thinks they spent too much.
>>
>>
>>
>> Are there any other ideas out there?
>>
>>
>>
>> Thanks

Re: De-duping recommendation

2010-08-09 Thread Rob Bonfiglio
I should have been clearer.  It's not so much a problem with the Quantum, as
it is that it wasn't spec'd out properly.  They knew it wasn't going to be
able to meet future expansion needs as it was ordered, but they needed a
solution in place quickly, and it was what they could afford at the time.

I'm not the one who runs the backups here, but my understanding is that the
backups are not able to finish before the process to dedupe the data
begins.  So the data is deduped, and then has to be re-hydrated so it can be
dumped onto tape.  This drastically increased the amount of time to complete
a backup to tape.

Supposedly the answer to this is to add another shelf of disks.  Adding
another shelf would bump us up from 11TB to 22TB of dedupe space.  Our end
goal is to have 50TB of storage, which is what we were using to compare with
the Data Domain.  We were comparing a new 50TB implementation of Data Domain
with the cost of upgrading the current Quantum implementation to 55TB.

The other part of the comparison, which played into the pricing, was the
replication of data from our remote locations to our new HQ building.  Data
domain was willing to work with us through the use of an extended "trial"
period, where Quantum was not.

Hope that answers your questions, and clarifies my earlier statements.  Like
I said, most of the decision was based on pricing.




On Mon, Aug 9, 2010 at 7:56 AM, Eldridge, Dave  wrote:

>  Rob if you don’t mind can you expand on your issues with your Quantum
> device? I have 2 7500dxi’s and they are functioning perfectly.
>
> thanks
>
>
>
> *From:* Rob Bonfiglio [mailto:robbonfig...@gmail.com]
> *Sent:* Sunday, August 08, 2010 2:10 PM
>
> *To:* NT System Admin Issues
> *Subject:* Re: De-duping recommendation
>
>
>
> We're purchasing Data Domain.  Looks like a great product.  We're currently
> using Quantum's dedupe product, but it is not meeting our needs.  To upgrade
> it to the comparable Data Domain equivalent would be more expensive than
> ripping out Quantum and going with Data Domain.  The cost for us was a huge
> factor.  We are also going  through a consolidation.  We are going to use
> Data Domain to dedupe and then replicate the data from our remote sites  to
> our new HQ.  (Quantum can do this as well.)
>
>
>
> Sorry I can't give you any current experience with it.  My understanding
> though, is that Data Domain is the best in the business (and this actually
> comes from the engineer of the company who is trying to sell us the Quantum
> solution.)
>
> On Sun, Aug 8, 2010 at 3:54 PM, Lists - Level5  wrote:
>
> We are being approached by Dell to get a few of their Data Domain san
> units. We currently have 4 6TB EQL’s and are running a 9TB/wk backup
> rotation. The owners have requested to have all data backed up and readily
> available for 7 years. Nothing on tape.
>
>
>
> I heard that NetApp is one of the leaders in the de-dupe space but no idea
> for certain. We are using Symantec BEX 2010 Enterprise currently, and Veeam
> enterprise for our VM’s (which are not part of the 9tb data backups
> currently).
>
>
>
> It looks like we will basically be backing up to the unit until its full
> (Dell says 30:1 or so) and then purchase a new one and so on and so on until
> someone thinks they spent too much.
>
>
>
> Are there any other ideas out there?
>
>
>
> Thanks

RE: Anyone Using Nagios?

2010-08-09 Thread Jason Gauthier
I use both Nagios and Cacti.  The one area where I'd like Nagios to offer
more is trap management.  Currently, you have to implement that process
yourself and glue it together.

 

I like Nagios.  Over the years (I've used it for half a dozen years, maybe
more), I've looked at other software.  Nothing beats the
price/functionality/ease-of-use combination.
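
On the graphing question specifically: the usual add-ons (PNP4Nagios,
NagiosGraph, or Cacti fed separately) work off the performance data Nagios
already emits, so the main prerequisite is perfdata being enabled on your
services. A minimal sketch (host name and thresholds are invented):

define service {
    use                   generic-service
    host_name             solaris-box01
    service_description   PING
    check_command         check_ping!100.0,20%!500.0,60%
    process_perf_data     1
}

With that in place, PNP4Nagios and friends will graph trend history out of
RRD files without further per-service work.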

 

 

From: Robert Jackson [mailto:r...@walkermartyn.co.uk] 
Sent: Thursday, August 05, 2010 2:00 AM
To: NT System Admin Issues
Subject: Anyone Using Nagios?

 

I'm looking at setting up a Solaris 10 (x86) Nagios server. The purpose
is to monitor server, service, and networking information. My problem
is I can't decide on a graphing solution that will allow me to view
trending information. Anyone have any ideas for the best graphing
solution for Nagios?

TIA.




The information in this internet E-mail is confidential and is intended
solely for the addressee. Access, copying or re-use of information in it
by anyone else is unauthorised. Any views or opinions presented are
solely those of the author and do not necessarily represent those of
Walker Martyn Ltd or any of its affiliates. If you are not the intended
recipient please contact administra...@walkermartyn.co.uk.

Walker Martyn Ltd, company number SC197533. Company is registered in
Scotland and has its registered office at 1 Park Circus Place, Glasgow
G3 6AH, UK.

 

 

 


RE: De-duping recommendation

2010-08-09 Thread Chyka, Robert
We are using Exagrid also.  I think they might be a little higher priced
but I have had their solution in for over a year at 2 locations and the
product just works.  I set it up once and haven't touched it since.  We
dedup at one site and then send it across our WAN to another Exagrid box
for disaster recovery.  Works great!

-Original Message-
From: John Hornbuckle [mailto:john.hornbuc...@taylor.k12.fl.us] 
Sent: Monday, August 09, 2010 9:05 AM
To: NT System Admin Issues
Subject: RE: De-duping recommendation

I can second ExaGrid; we're using their solution here, and have had no
problems.

I don't know how it compares with alternatives in terms of price,
though.



John Hornbuckle
MIS Department
Taylor County School District
www.taylor.k12.fl.us





-Original Message-
From: Chyka, Robert [mailto:bch...@medaille.edu] 
Sent: Sunday, August 08, 2010 4:34 PM
To: NT System Admin Issues
Subject: RE: De-duping recommendation

Exagrid!  And you won't get 30 to 1... Probably 15 to 1 depending upon
the type of data being backed up.

-Original Message-
From: Lists - Level5 
Sent: Sunday, August 08, 2010 3:52 PM
To: NT System Admin Issues 
Subject: De-duping recommendation

We are being approached by Dell to get a few of their Data Domain san
units.
We currently have 4 6TB EQL's and are running a 9TB/wk backup rotation.
The owners have requested to have all data backed up and readily
available for 7 years. Nothing on tape.

 

I heard that NetApp is one of the leaders in the de-dupe space but no
idea for certain. We are using Symantec BEX 2010 Enterprise currently,
and Veeam enterprise for our VM's (which are not part of the 9tb data
backups currently).

 

It looks like we will basically be backing up to the unit until its full
(Dell says 30:1 or so) and then purchase a new one and so on and so on
until someone thinks they spent too much. 

 

Are there any other ideas out there?

 

Thanks

 










RE: De-duping recommendation

2010-08-09 Thread John Hornbuckle
I can second ExaGrid; we're using their solution here, and have had no problems.

I don't know how it compares with alternatives in terms of price, though.



John Hornbuckle
MIS Department
Taylor County School District
www.taylor.k12.fl.us





-Original Message-
From: Chyka, Robert [mailto:bch...@medaille.edu] 
Sent: Sunday, August 08, 2010 4:34 PM
To: NT System Admin Issues
Subject: RE: De-duping recommendation

Exagrid!  And you won't get 30 to 1... Probably 15 to 1 depending upon the type 
of data being backed up.

-Original Message-
From: Lists - Level5 
Sent: Sunday, August 08, 2010 3:52 PM
To: NT System Admin Issues 
Subject: De-duping recommendation

We are being approached by Dell to get a few of their Data Domain san units.
We currently have 4 6TB EQL's and are running a 9TB/wk backup rotation. The 
owners have requested to have all data backed up and readily available for 7 
years. Nothing on tape.

 

I heard that NetApp is one of the leaders in the de-dupe space but no idea for 
certain. We are using Symantec BEX 2010 Enterprise currently, and Veeam 
enterprise for our VM's (which are not part of the 9tb data backups currently).

 

It looks like we will basically be backing up to the unit until its full (Dell 
says 30:1 or so) and then purchase a new one and so on and so on until someone 
thinks they spent too much. 

 

Are there any other ideas out there?

 

Thanks

 










RE: De-duping recommendation

2010-08-09 Thread Lists - Level5
Thanks, Kevin, for the insight. Our DD will actually be in our datacenter,
where about 80% of our servers are now, and we already do SAN-to-SAN
replication every 30 mins, so we won't be replicating the DD unit, but we
will be adding more and more. My biggest concern is managing the Symantec
BEX for the years to come. Heh.

 

From: Kevin Lundy [mailto:klu...@gmail.com] 
Sent: Monday, August 09, 2010 8:29 AM
To: NT System Admin Issues
Subject: Re: De-duping recommendation

 

You will have to pry our Data Domains from our cold stiff hands :)

 

So far, we have written 281 TB worth of backups to only 10 TB of disk.  The
dedupe/compression varies significantly by data set.  It also grows over
time.  Our VM backups using vRanger are getting close to 70x dedupe, and
growing.  Some of our visual databases only get about 10x.

 

With two units, replication is simple, and comes post dedupe/compression.
So saves bandwidth.

 

One recommendation is not to buy via Dell.  EMC support through Dell is not
good from what I hear.  I was at an EMC customer council 2 years ago, and
that was the number 1 complaint.  But then again, we haven't needed support
on the DD yet.  I do recommend purchasing installation assistance.  Not that
it is difficult, but the paradigm is a bit different.

On Sun, Aug 8, 2010 at 3:54 PM, Lists - Level5  wrote:

We are being approached by Dell to get a few of their Data Domain san units.
We currently have 4 6TB EQL's and are running a 9TB/wk backup rotation. The
owners have requested to have all data backed up and readily available for 7
years. Nothing on tape.

 

I heard that NetApp is one of the leaders in the de-dupe space but no idea
for certain. We are using Symantec BEX 2010 Enterprise currently, and Veeam
enterprise for our VM's (which are not part of the 9tb data backups
currently).

 

It looks like we will basically be backing up to the unit until its full
(Dell says 30:1 or so) and then purchase a new one and so on and so on until
someone thinks they spent too much. 

 

Are there any other ideas out there?

 

Thanks

 

 

 

 

 

 


Re: De-duping recommendation

2010-08-09 Thread Kevin Lundy
You will have to pry our Data Domains from our cold stiff hands :)

So far, we have written 281 TB worth of backups to only 10 TB of disk.  The
dedupe/compression varies significantly by data set.  It also grows over
time.  Our VM backups using vRanger are getting close to 70x dedupe, and
growing.  Some of our visual databases only get about 10x.
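
(For reference, that works out to roughly 281/10 = ~28:1 aggregate across
everything, which squares with the per-dataset spread above.)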

With two units, replication is simple, and comes post dedupe/compression.
So saves bandwidth.

One recommendation is not to buy via Dell.  EMC support through Dell is not
good from what I hear.  I was at an EMC customer council 2 years ago, and
that was the number 1 complaint.  But then again, we haven't needed support
on the DD yet.  I do recommend purchasing installation assistance.  Not that
it is difficult, but the paradigm is a bit different.

On Sun, Aug 8, 2010 at 3:54 PM, Lists - Level5  wrote:

>  We are being approached by Dell to get a few of their Data Domain san
> units. We currently have 4 6TB EQL’s and are running a 9TB/wk backup
> rotation. The owners have requested to have all data backed up and readily
> available for 7 years. Nothing on tape.
>
>
>
> I heard that NetApp is one of the leaders in the de-dupe space but no idea
> for certain. We are using Symantec BEX 2010 Enterprise currently, and Veeam
> enterprise for our VM’s (which are not part of the 9tb data backups
> currently).
>
>
>
> It looks like we will basically be backing up to the unit until its full
> (Dell says 30:1 or so) and then purchase a new one and so on and so on until
> someone thinks they spent too much.
>
>
>
> Are there any other ideas out there?
>
>
>
> Thanks

RE: Massive Patch Tuesday

2010-08-09 Thread Maglinger, Paul
May the schwartz be with you.

-Original Message-
From: Kurt Buff [mailto:kurt.b...@gmail.com] 
Sent: Friday, August 06, 2010 6:11 PM
To: NT System Admin Issues
Subject: Re: Massive Patch Tuesday

On Fri, Aug 6, 2010 at 15:16, Ben Scott  wrote:
> On Fri, Aug 6, 2010 at 5:32 PM, Andrew S. Baker  wrote:
>> Back in action, you are.
>
>  Talk like Yoda, you do.  ;-)
>
> -- Ben

The farce is strong in this one...




RE: Friday OT: Man's best friend?

2010-08-09 Thread Maglinger, Paul
Hmmm... Where'd my breakfast go?  Oh, never mind... here it is... all over my 
shoes...

-Original Message-
From: Kurt Buff [mailto:kurt.b...@gmail.com] 
Sent: Friday, August 06, 2010 6:09 PM
To: NT System Admin Issues
Subject: Friday OT: Man's best friend?

I am at a loss for words on this one

http://www.mlive.com/news/grand-rapids/index.ssf/2010/08/dog_eats_rockford_mans_big_toe.html


RE: De-duping recommendation

2010-08-09 Thread Eldridge, Dave
Rob if you don't mind can you expand on your issues with your Quantum
device? I have 2 7500dxi's and they are functioning perfectly.

thanks

 

From: Rob Bonfiglio [mailto:robbonfig...@gmail.com] 
Sent: Sunday, August 08, 2010 2:10 PM
To: NT System Admin Issues
Subject: Re: De-duping recommendation

 

We're purchasing Data Domain.  Looks like a great product.  We're
currently using Quantum's dedupe product, but it is not meeting our
needs.  To upgrade it to the comparable Data Domain equivalent would be
more expensive than ripping out Quantum and going with Data Domain.  The
cost for us was a huge factor.  We are also going  through a
consolidation.  We are going to use Data Domain to dedupe and then
replicate the data from our remote sites  to our new HQ.  (Quantum can
do this as well.)

 

Sorry I can't give you any current experience with it.  My understanding
though, is that Data Domain is the best in the business (and this
actually comes from the engineer of the company who is trying to sell us
the Quantum solution.)

On Sun, Aug 8, 2010 at 3:54 PM, Lists - Level5 
wrote:

We are being approached by Dell to get a few of their Data Domain san
units. We currently have 4 6TB EQL's and are running a 9TB/wk backup
rotation. The owners have requested to have all data backed up and
readily available for 7 years. Nothing on tape.

 

I heard that NetApp is one of the leaders in the de-dupe space but no
idea for certain. We are using Symantec BEX 2010 Enterprise currently,
and Veeam enterprise for our VM's (which are not part of the 9tb data
backups currently).

 

It looks like we will basically be backing up to the unit until its full
(Dell says 30:1 or so) and then purchase a new one and so on and so on
until someone thinks they spent too much. 

 

Are there any other ideas out there?

 

Thanks

 

 

 

 

 

 



This message contains confidential information and is intended only for the 
intended recipient(s). If you are not the named recipient you should not read, 
distribute or copy this e-mail. Please notify the sender immediately via e-mail 
if you have received this e-mail by mistake; then, delete this e-mail from your 
system.

RE: De-duping recommendation

2010-08-09 Thread Erik Goldoff
“(Dell says 30:1 or so)”

 

Did they say 30:1 or “UP TO” 30:1 ?

(remembering that UP TO starts with ZERO)
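
Back-of-envelope on why that matters, assuming the full 9TB/wk really is
retained for the whole 7 years with zero growth:

9 TB/wk x 52 x 7  =  ~3,276 TB logical
at 30:1  ->  ~109 TB of physical disk
at 15:1  ->  ~218 TB of physical disk

i.e. the gap between the brochure ratio and a realistic one roughly doubles
the disk you end up buying.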

 

Erik Goldoff

IT  Consultant

Systems, Networks, & Security 

'  Security is an ongoing process, not a one time event ! '

From: Lists - Level5 [mailto:li...@levelfive.us] 
Sent: Sunday, August 08, 2010 3:54 PM
To: NT System Admin Issues
Subject: De-duping recommendation

 

We are being approached by Dell to get a few of their Data Domain san units.
We currently have 4 6TB EQL’s and are running a 9TB/wk backup rotation. The
owners have requested to have all data backed up and readily available for 7
years. Nothing on tape.

 

I heard that NetApp is one of the leaders in the de-dupe space but no idea
for certain. We are using Symantec BEX 2010 Enterprise currently, and Veeam
enterprise for our VM’s (which are not part of the 9tb data backups
currently).

 

It looks like we will basically be backing up to the unit until its full
(Dell says 30:1 or so) and then purchase a new one and so on and so on until
someone thinks they spent too much. 

 

Are there any other ideas out there?

 

Thanks

 

 

 
