Re: Rugged wireless bridge
Blackbox also has some offerings in this area, made to order: http://www.blackbox.com/solutions/infrastructure/Wireless-Bridges.aspx

-henry

From: Andrey Khomyakov khomyakov.and...@gmail.com
To: Nanog nanog@nanog.org
Sent: Wed, May 12, 2010 4:23:24 PM
Subject: Re: Rugged wireless bridge

I found this sucker so far. I guess it has to be waterproof rather than just rugged: http://www.korenixsecurity.com/products/weatherproof-ethernet-switch/jetnet-3706-rj

On May 12, 2010, at 7:11 PM, Mike Lyon wrote:

Not sure how outdoor-worthy those guys are... -Mike

On Wed, May 12, 2010 at 4:04 PM, Seth Mattinen se...@rollernet.us wrote:

On 5/12/2010 15:53, Andrey Khomyakov wrote:

Hi all again. Thanks for all the links; lots of wifi solutions. The main problem I'm facing is that I need more than one copper Ethernet connection at those outdoor locations, meaning I'll have at least two or three IP cameras (PoE desired) and an automatic security gate. So I feel like an outdoor-rated copper Ethernet switch with PoE is the hardest part here. Has anyone come across any 5-8 port PoE switches that can be put outdoors?

Like these? http://www.bb-elec.com/product_multi_family.asp?MultiFamilyId=90&Trail=1&TrailType=Top The EIRP305-T and EIRP610-2SFP-T have PoE.

~Seth
whois.rpki.net
ruediger-style pseudo irr route: objects from the rpki testbed are available from whois.rpki.net

rmac.psg.com:/Users/randy whois -h whois.rpki.net 98.128.0.0/16

route:   98.128.0.0/16
descr:   98.128.0.0/16-24
origin:  AS3130
notify:  irr-h...@rpki.net
mnt-by:  MAINT-RPKI
changed: irr-h...@rpki.net 20100323
source:  RPKI

this gives testbed players an easy way to see the state of their valid roas, as well as providing a path for using the rpki as an irr overlay (credit to ruediger).

randy
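For anyone scripting against the testbed: whois is just TCP port 43 with a CRLF-terminated query (RFC 3912), so the lookup randy shows can be reproduced in a few lines. A minimal Python sketch; the server name comes from the post, and the lookup itself obviously needs network connectivity to it:

```python
import socket

WHOIS_HOST = "whois.rpki.net"   # the testbed server named in the post
WHOIS_PORT = 43                 # whois protocol (RFC 3912)

def build_query(prefix: str) -> bytes:
    """A whois query is just the object name followed by CRLF."""
    return prefix.encode("ascii") + b"\r\n"

def whois_lookup(prefix: str, host: str = WHOIS_HOST) -> str:
    """Rough equivalent of `whois -h whois.rpki.net 98.128.0.0/16`."""
    with socket.create_connection((host, WHOIS_PORT), timeout=10) as s:
        s.sendall(build_query(prefix))
        chunks = []
        while data := s.recv(4096):   # server closes the connection at EOF
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")
```

From there, parsing the `route:`/`origin:` pairs out of the response text is straightforward string splitting.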
POE switches and lightning
We had a lightning strike nearby yesterday that looks to have come inside our facility via a feeder circuit that runs outdoors, underground, to our facility's gate. What's interesting is that various PoE switches throughout the entire building seemed to be affected, in that some of their ports just shut down. Rebooting these switches brought everything back to life. It didn't impact anything non-PoE, and even then it only impacted some devices, but it was spread across the whole building, across multiple switches. I was just curious if anyone had seen anything similar to this before? Our incoming electrical power has surge suppression, and the power to the switches all goes through double-conversion UPSes, so I'm not quite sure why any of them would have been impacted at all. I'm guessing that the strike had some impact on the electrical ground, but I don't know what we can do to prevent future strikes from causing the same issues. Thoughts?
RE: POE switches and lightning
My first guess would be that the lightning was close enough/powerful enough to send out an EM pulse that got picked up by the copper going to the devices. This pulse may have been interpreted at the switchport as the device relinquishing power. Have you tried just unplugging one of the devices from the Ethernet and plugging it back in to reset the PoE exchange?

Ken Matlock
Network Analyst
Exempla Healthcare
(303) 467-4671
matlo...@exempla.org

-Original Message-
From: Caleb Tennis [mailto:caleb.ten...@gmail.com]
Sent: Thursday, May 13, 2010 9:37 AM
To: North American Network Operators Group
Subject: POE switches and lightning

[original message snipped]
Re: POE switches and lightning
On 5/13/2010 10:36, Caleb Tennis wrote: [snip]

I don't know how to account for this in a PoE world, but when I last managed a campus network we had major issues (particularly in an active-thunderstorm environment) with severe differences in ground potential between buildings. The only way we could survive was to connect buildings (including free-standing kiosks) with their own grounds using glass (fiber). Does anybody make a Cat 5 1-to-1 isolation transformer?
Re: POE switches and lightning
On 05/13/2010 12:19 PM, Larry Sheldon wrote:

I don't know how to account for this in a PoE world, but when I last managed a campus network we had major issues (particularly in an active-thunderstorm environment) with severe differences in ground potential between buildings.

Cat 5 has isolation transformers in or just behind each jack. However, in most equipment the grounds aren't really isolated, and in the case of PoE they (mostly) aren't at all. Lightning likes to do interesting things. It can induce a 20 kV per few feet gradient (or more) across the ground mesh of a power substation (like 4/0 wire in a mesh of 4-foot squares or so; normally more complicated than that, since it has to clear equipment, etc.). It likes to eat power supplies in well-grounded equipment and leave cheaper stuff alone. It can hit an antenna, leave the receiver completely intact, and fry the power supply of the next box over.
We tended to lose either fluorescent ballasts or the thermostat transformer in our furnace when I lived in an active ham's house in Alabama; the radios tended to live. (You should have seen his coax entry panel: 1/4-inch copper sheet, grounded outside.) Stuff got manually disconnected from both antennas and power when a storm was expected (every afternoon :-). It wouldn't surprise me if the first answer was right and either the ground pulse or EMP reset the safety switches in the PoE feeders. -- Pete
Re: POE switches and lightning
While the equipment may well be affected by an EM pulse, if the gear returns to normal after a power cycle then the equipment vendor didn't fully do their job developing the product. A product should be tested to take such pulses and should recover provided it has not suffered a catastrophic failure (and in fact it should contain sufficient protection to avoid such failure in most cases). While working on one particular router in the lab some years ago, I was verifying some software functionality, and the hardware engineer I was working with reached over my shoulder and used a device that delivered a high-voltage spike (simulated lightning) to a 10BaseT network port. After I peeled myself off the ceiling (and he stopped laughing), we set to work figuring out how to get the device to self-reset after such a strike. One component, an Ethernet hub chip, got into a confused state. I was able to detect this in software, so we adjusted the product design so that the software could yank the hub chip's reset line. It's unfortunate that products, both hardware and software, receive minimal quality testing these days. Guess it's not a surprise, since buyers seem to prefer products that are quick to market, with lots of bugs, over reliability and resilience.

On May 13, 2010, at 12:39 PM, Pete Carah wrote: [snip]
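Daniel's detect-and-reset approach can be sketched in miniature. The status values and the chip model below are invented for illustration; in the real product the check would read the hub chip's actual registers over its management bus:

```python
HUB_STATUS_OK = 0x0001  # hypothetical "all good" status value

class HubChip:
    """Mock of an Ethernet hub chip whose state a surge can scramble."""
    def __init__(self):
        self.status = 0xDEAD  # pretend a nearby strike left it confused

    def read_status(self) -> int:
        return self.status

    def pulse_reset(self) -> None:
        # In the real product this yanked the chip's hardware reset line.
        self.status = HUB_STATUS_OK

def hub_health_check(chip: HubChip) -> bool:
    """Periodic watchdog: reset the chip if it reports a bad state."""
    if chip.read_status() != HUB_STATUS_OK:
        chip.pulse_reset()
        return chip.read_status() == HUB_STATUS_OK  # did the reset take?
    return True
```

Run from a periodic timer, this is the difference between a port that comes back on its own and one that waits for a human to power-cycle the switch.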
RE: POE switches and lightning
About a month ago, we had a lightning strike near our main campus. We lost one PoE Cisco 3560 completely (apparently a blown power supply), and in a separate but nearby building, another 3560 lost the ability to deliver PoE but continued to operate as a switch. Both had to be replaced. Both were on wiring-closet-type UPSes with surge suppression, and those were unaffected.

Mark

-Original Message-
From: Caleb Tennis [mailto:caleb.ten...@gmail.com]
Sent: Thursday, May 13, 2010 10:37 AM
To: North American Network Operators Group
Subject: POE switches and lightning

[original message snipped]
Re: POE switches and lightning
Caleb Tennis wrote: [snip]

I use these on any cable that leaves my building: http://www.amazon.com/APC-PNET1GB-ProtectNet-Standalone-Protector/dp/B000BKUSS8 It seems to play well with PoE (I put mine before the injector), and it also works well with T1s and POTS. -Paul
Re: POE switches and lightning
On May 13, 2010, at 2:24:04 PM, Daniel Senie wrote: [snip]

It's not just a matter of these days -- lightning is awfully hard to deal with, because of how quirky the real-world behavior can be. I had to deal with this a lot in the 1970s on RS-232 lines; we could never predict what would get fried. Of course, there was also a ground strike very near my apartment, where the induced current tripped a circuit breaker, blew out a couple of lightbulbs, and came in through the cable TV line to fry the cable box, the impedance-matching transformer, and the RF input stage on the television...

--Steve Bellovin, http://www.cs.columbia.edu/~smb
Re: POE switches and lightning
On 05/13/2010 02:52 PM, Steven Bellovin wrote:

On May 13, 2010, at 2:24 PM, Daniel Senie wrote: ... One component, an Ethernet hub chip, got into a confused state. I was able to detect this in software, so we adjusted the product design so that the software could yank the hub chip's reset line.

Luck. I've needed that kind of reset a few times...

It's unfortunate that products, both hardware and software, receive minimal quality testing these days. Guess it's not a surprise, since buyers seem to prefer products that are quick to market, with lots of bugs, over reliability and resilience.

That is certainly true (and not entirely modern; you can read about that problem in old Roman literature. When was "Zen and the Art of Motorcycle Maintenance" written? The 1970s); however, it is nearly impossible to protect well against close-by lightning.

It's not just a matter of these days -- lightning is awfully hard to deal with, because of how quirky the real-world behavior can be. I had to deal with this a lot in the 1970s on RS-232 lines; we could never predict what would get fried. Of course, there was also a ground strike very near my apartment, where the induced current tripped a circuit breaker, blew out a couple of lightbulbs, and came in through the cable TV line to fry the cable box, the impedance-matching transformer, and the RF input stage on the television...

I can second Steve in spades; I used to work for the power company in Alabama, where you learn a LOT more than you ever wanted to know about lightning. Consider that one hit can destroy the inside of a 10 MW 66kV-12kV distribution transformer. (I actually saw the strike involved; it was less than a mile from my apartment at the time, and dropped power to me; the apartment was fed from an entirely different company. My power came back in a few minutes; the other load took almost a week. They had a redundant feed, since it was a hospital, but they ran in a low-power mode till a BIG crane and a big low-boy truck came with another transformer.) How are you going to protect any computer from *that*?

-- Pete
Dark fiber / transport in Virginia
All, I am interested in finding out about dark fiber / transport resources along I-81 or I-64 in the western part of Virginia. I’d like to find a transport provider that could connect to a “meet me” room in either Roanoke, Charlottesville, Richmond, DC, or even Charleston, WV. I’m trying to price out alternatives to the telco transport and data delivery model and I’m new to the Virginia market. Thanks for any help that you can offer! -Mike
RE: Dark fiber / transport in Virginia
You might try the cable operator Charter (charter.com); I believe they operate in that area.

Gary

-Original Message-
From: Courtney, Mike [mailto:mcourt...@wlu.edu]
Sent: Thursday, May 13, 2010 5:23 PM
To: nanog@nanog.org
Subject: Dark fiber / transport in Virginia

[original message snipped]
Re: Dark fiber / transport in Virginia
Abovenet/Verizon has fibers in those paths.

mehmet

On 5/13/10 2:23 PM, Courtney, Mike mcourt...@wlu.edu wrote: [snip]
ipv6 transit over tunneled connection
Hello, We're in the early stage of planning IPv6 deployment - learning/labbing/experimenting/etc. We've gotten to the point where we're also planning to request an initial IPv6 allocation from ARIN. So I wonder what IPv6 transit options I have if my upstreams do not support native IPv6 connectivity? I see Hurricane Electric's tunnel broker BGP tunnel. Is there anything else, either free or commercial? Thanks, Michael
Re: ipv6 transit over tunneled connection
Occaid will generally transit you via two tunnels to their endpoints. I used them for a year with zero issues, in addition to an HE tunnel. -Jack Carrozzo

On Thu, May 13, 2010 at 6:18 PM, Michael Ulitskiy mulits...@acedsl.com wrote: [snip]
Re: Abbott dumps NBN from budget reply
Sorry guys - meant to send this to AusNOG. Matt

On Fri, May 14, 2010 at 10:09 AM, Matt Shadbolt matt.shadb...@gmail.com wrote:

Did anyone hear about this? "Describing the National Broadband Network as a $43 billion white elephant, he confirmed the Coalition would not go ahead with the program Prime Minister Kevin Rudd argues will deliver faster internet speeds across the country." http://www.theaustralian.com.au/in-depth/budget/tony-abbott-would-slash-public-service-budget-reply/story-e6frgd66-1225866272354

So, vote for an internet filter or vote for no NBN? Matt.
Re: ipv6 transit over tunneled connection
On Thu, May 13, 2010 at 6:18 PM, Michael Ulitskiy mulits...@acedsl.com wrote: [snip]

1) See gblx/ntt/sprint/twt/vzb for v6 transit.
2) Tunnel inside your domain (your control, your MTU issues, your alternate pathing of tunnels vs. pipe).
3) Don't tunnel beyond your borders. Really, just don't; tunnels are bad, always.

-chris
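As a sketch of option 2 (a tunnel you control, inside your own domain), a 6in4 tunnel between two Linux routers is a few iproute2 commands. Every address below is a documentation placeholder, not a real endpoint, and the MTU caveat behind "your MTU issues" is why the tunnel MTU is pinned explicitly:

```
# 6in4 (IP protocol 41) tunnel between two routers you control; run as root.
ip tunnel add v6tun mode sit local 198.51.100.2 remote 203.0.113.1 ttl 255
ip link set v6tun up mtu 1480          # 1500 minus the 20-byte IPv4 header
ip addr add 2001:db8:ffff::2/64 dev v6tun
ip -6 route add ::/0 dev v6tun         # or a more specific internal route
```

The far router mirrors this with `local`/`remote` swapped and the `::1` address on the /64.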
RE: BGP and convergence time
What about IP SLA with some EEM? This link may give you some ideas: http://blog.ioshints.info/2008/01/ospf-default-route-based-on-ip-sla.html

Frank

-Original Message-
From: Jay Nakamura [mailto:zeusda...@gmail.com]
Sent: Tuesday, May 11, 2010 1:35 PM
To: NANOG
Subject: BGP and convergence time

So, we have two upstreams, both coming in on Ethernet. One of our switches crashed and rebooted itself. Although we have other paths to egress out of the network, because the router's Ethernet interface didn't go down, our router's BGP didn't realize the neighbor was down until the default BGP hold timer expired. Our upstream connectivity was out for a couple of minutes. I am looking for ways to detect a neighbor being down faster so traffic can be re-routed faster. I can do BFD internally, but the issue is how the upstream is going to detect the outage and stop routing our traffic to that downed link. I have asked both of my upstreams; one said they don't do anything like that, and from the second upstream I am still waiting on an answer. My question is: do other carriers do BFD or use any other means to detect a neighbor being down faster than normal BGP will allow? (Both upstreams are major telcos [AT&T and Qwest], so I think they are less flexible than some others.) Or, has anyone succeeded in getting something done with those two carriers? Thanks!
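The faster-detection knobs discussed in this thread can be sketched in Cisco IOS syntax. The interface, timers, AS numbers, and addresses below are illustrative only, and BFD across the eBGP session only helps if the provider agrees to run it on their side:

```
! Enable BFD on the link toward the upstream (values are examples)
interface GigabitEthernet0/0
 bfd interval 300 min_rx 300 multiplier 3
!
router bgp 64496
 neighbor 192.0.2.1 remote-as 64511
 ! Tear the session down as soon as BFD declares the peer dead
 neighbor 192.0.2.1 fall-over bfd
 ! Without provider cooperation, plain "fall-over" reacts when the
 ! route to the peer leaves the RIB -- which won't fire here, since a
 ! crashed intermediate switch leaves the connected route up:
 ! neighbor 192.0.2.1 fall-over
```

That last caveat is exactly Jay's problem: with a switch in the path, the Ethernet interface stays up, so only an end-to-end liveness check (BFD, or the IP SLA/EEM approach Frank links) detects the failure quickly.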
RE: Dial Concentrators - TNT / APX8000 R.I.P.
Thirty percent? If "no access" includes financial means or developed interest, that may be true, but 99% of all zip codes have at least one person with internet access. And the FCC has stated that 95 percent of Americans, or 290 million people, have terrestrial broadband access: http://blog.zcorum.com/2010/03/national-broadband-plan-the-debate-begins/

Frank

-Original Message-
From: Curtis Maurand [mailto:cmaur...@xyonet.com]
Sent: Tuesday, May 11, 2010 10:51 AM
To: nanog@nanog.org
Subject: Re: Dial Concentrators - TNT / APX8000 R.I.P.

30% of all people in the US (110 million) have no access to broadband. Large areas of my state have no access to broadband because it's rural (Maine). Aastra CVX (it used to be a Nortel product.)

--Curtis

On 5/11/2010 11:29 AM, Joe Abley wrote:

On 2010-05-11, at 11:08, Leo Bicknell wrote: There comes a time when the old tech just doesn't make sense, even if a small customer base still wants it.

There will also no doubt continue to be many customers for whom dial is the only option. It's not long ago that I lived in such a house, deceptively close to the outskirts of town, but in terms of wire distance and load coils it might as well have been on the moon. The house was in a wireless dead zone by a river, there was no cable, and the only line of sight to another structure was through several acres of 2.4GHz-absorbing trees. The further you move away from urban centres, the easier it is to find examples of this.

Joe
Re: Dial Concentrators - TNT / APX8000 R.I.P.
On 2010-05-13 19:43, Frank Bulk wrote: [snip]

-Original Message-
From: Curtis Maurand [mailto:cmaur...@xyonet.com]
Sent: Tuesday, May 11, 2010 10:51 AM
To: nanog@nanog.org
Subject: Re: Dial Concentrators - TNT / APX8000 R.I.P.

30% of all people in the US (110 million) have no access to broadband. Large areas of my state have no access to broadband because it's rural (Maine).

The rural population represented 20.7% of the US population in the 2000 census. About 70% of the US population is concentrated in about 2% of the land area.

[rest of quoted thread snipped]