[ https://issues.apache.org/jira/browse/SPARK-926?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-926.
-----------------------------
    Resolution: Duplicate

Going to make this one the duplicate, since SPARK-5403 has an active PR.

> spark_ec2 script when ssh/scp-ing should pipe UserKnownHostsFile to /dev/null
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-926
>                 URL: https://issues.apache.org/jira/browse/SPARK-926
>             Project: Spark
>          Issue Type: New Feature
>          Components: EC2
>    Affects Versions: 0.8.0
>            Reporter: Shay Seng
>            Priority: Trivial
>
> The known_hosts file on the local machine accumulates all kinds of stale
> entries after a few cluster launches. When SSHing or SCPing, please add
> "-o UserKnownHostsFile=/dev/null".
> Also remove the -t option from ssh, and add it back only when necessary, to
> reduce chatter on the console.
> e.g.
>
> import subprocess
> import time
>
> # Copy a file to a given host through scp, throwing an exception if scp fails.
> def scp(host, opts, local_file, dest_file):
>     subprocess.check_call(
>         "scp -q -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i %s '%s' '%s@%s:%s'" %
>         (opts.identity_file, local_file, opts.user, host, dest_file),
>         shell=True)
>
> # Run a command on a host through ssh, retrying up to two times
> # and then throwing an exception if ssh continues to fail.
> def ssh(host, opts, command, sshopts=""):
>     tries = 0
>     while True:
>         try:
>             # Removed the -t option from the ssh command; not sure why it
>             # was required all the time.
>             return subprocess.check_call(
>                 "ssh %s -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i %s %s@%s '%s'" %
>                 (sshopts, opts.identity_file, opts.user, host, command),
>                 shell=True)
>         except subprocess.CalledProcessError as e:
>             if tries > 2:
>                 raise e
>             print "Couldn't connect to host {0}, waiting 30 seconds".format(host)
>             time.sleep(30)
>             tries += 1
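For comparison, a minimal sketch of how the requested options could be factored into a shared helper; the ssh_args helper, the tty flag, and the retry count are illustrative assumptions, not the actual spark_ec2 implementation. Passing an argument list instead of shell=True also sidesteps the quoting problems in the inline command strings above.

import subprocess
import time

def ssh_args(opts):
    # Options shared by ssh and scp: skip the interactive host-key prompt
    # and write new known_hosts entries to /dev/null so the local file
    # stays clean across cluster launches.
    return ["-o", "StrictHostKeyChecking=no",
            "-o", "UserKnownHostsFile=/dev/null",
            "-i", opts.identity_file]

def ssh(host, opts, command, tty=False):
    # Add -t only when a TTY is actually needed, so routine commands
    # stay quiet on the console.
    cmd = (["ssh"] + (["-t"] if tty else []) + ssh_args(opts) +
           ["%s@%s" % (opts.user, host), command])
    for attempt in range(3):
        try:
            # An argument list (no shell=True) avoids quoting bugs when
            # paths or commands contain spaces.
            return subprocess.check_call(cmd)
        except subprocess.CalledProcessError:
            if attempt == 2:
                raise
            print("Couldn't connect to host %s, waiting 30 seconds" % host)
            time.sleep(30)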